testing: reporting non-standard benchmark results #16110
Comments
Interesting. cc @aclements @mpvl for thoughts. Tentatively marking this Go 1.8.
What is the API you are looking for?
ping @benburkert
I think it would mostly consist of a …
Oh, this isn't at all what I thought you were proposing. I thought you wanted custom metrics (e.g., metrics other than the standard ns/op, allocs/op, etc). That's certainly what I want. :) Can you explain why sub-benchmarks don't already solve this?
That would be a nice ability as well. For my immediate need I'm using the sub-benchmark's name to specify the custom metric:
But an API for specifying a custom unit would be preferable:
Which could be done with an extra method on …
I'm building on sub-benchmarks, but using the reflection package to get at name and iteration values. And even with reflection there's no way to get at the …
If I understand correctly, it seems like what you really want is a way of reporting custom metrics, and doing this through sub-benchmarks/extra benchmark lines is just a hack to get around the lack of custom metrics. If so, we should add custom metrics, not add mechanism to support the hack.
Yes, that's what I want; sorry for the confusion.
Closing this in favor of #26037, which covers the same ground but appears to be moving forward.
I would like to report the results of a non-standard benchmark (non google/benchmark style) alongside existing benchmarks, and to do so following the proposed Go benchmark data format. For example, the pbench package reports percentiles in addition to the standard results.

I was unable to implement this benchmark without using reflection to access some unexported fields, and there was no mechanism for using the same `io.Writer` for the output. A preferable solution would be a new method on `B` that, provided a name and a benchmark-result argument, formats and writes the results in the standard format.

Additionally, adding a `ReportAllocs` field to `BenchmarkResult` would be useful for when a benchmark is run with `-benchmem` but a non-standard benchmark does not support malloc statistics.