Percentiles rounding error #331
I noticed this as well. It seems to be deliberate: locust/stats.py#L160. The numbers are accurate enough for my purposes, so no complaint here.
@jalan 99% > 100% is confusing. Perhaps the 100th percentile should be rounded also?
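For anyone puzzled by that: a minimal sketch of how the report can show a 99% column above the 100% column. The round_response_time helper below is hypothetical, just following the bucketing described in the comment in locust/stats.py (147 becomes 150, 3432 becomes 3400, 58760 becomes 59000); percentiles are read from those rounded buckets, while min/max were recorded from the raw values.

```python
def round_response_time(response_time):
    # hypothetical helper mirroring the bucketing comment in locust/stats.py:
    # 147 -> 150, 3432 -> 3400, 58760 -> 59000 (keep roughly 2 significant digits)
    if response_time < 100:
        return response_time
    if response_time < 1000:
        return int(round(response_time, -1))
    if response_time < 10000:
        return int(round(response_time, -2))
    return int(round(response_time, -3))

raw_times = [147, 147, 147]        # every observed latency, in milliseconds
max_time = max(raw_times)          # the 100% column is taken from the raw values: 147
# every sample lands in the 150 ms bucket, so any percentile read from the
# rounded dict is 150 -- larger than the reported max of 147
p99 = round_response_time(raw_times[0])
print(p99, ">", max_time)          # 150 > 147
```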
@cgoldberg: Agreed. One option would be to round the min and max times just like all the other times:

diff --git a/locust/stats.py b/locust/stats.py
index 267dee4..79d71a5 100644
--- a/locust/stats.py
+++ b/locust/stats.py
@@ -151,12 +151,6 @@ class StatsEntry(object):
     def _log_response_time(self, response_time):
         self.total_response_time += response_time
 
-        if self.min_response_time is None:
-            self.min_response_time = response_time
-
-        self.min_response_time = min(self.min_response_time, response_time)
-        self.max_response_time = max(self.max_response_time, response_time)
-
         # to avoid to much data that has to be transfered to the master node when
         # running in distributed mode, we save the response time rounded in a dict
         # so that 147 becomes 150, 3432 becomes 3400 and 58760 becomes 59000
@@ -173,6 +167,12 @@ class StatsEntry(object):
         self.response_times.setdefault(rounded_response_time, 0)
         self.response_times[rounded_response_time] += 1
+        # update min and max response times
+        if self.min_response_time is None:
+            self.min_response_time = rounded_response_time
+        self.min_response_time = min(self.min_response_time, rounded_response_time)
+        self.max_response_time = max(self.max_response_time, rounded_response_time)
+
 
     def log_error(self, error):
         self.num_failures += 1
         self.stats.num_failures += 1
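A rough sketch of what that change does to the reported numbers, reusing the hypothetical round_response_time helper from the sketch above; StatsEntrySketch is a simplified stand-in, not the actual locust class:

```python
class StatsEntrySketch:
    """Simplified stand-in for StatsEntry, covering only the fields the diff touches."""

    def __init__(self):
        self.min_response_time = None
        self.max_response_time = 0
        self.response_times = {}

    def log_response_time(self, response_time):
        rounded = round_response_time(response_time)
        self.response_times[rounded] = self.response_times.get(rounded, 0) + 1
        # with the patch, min/max track the rounded value, so they sit in the
        # same buckets the percentiles are computed from
        if self.min_response_time is None:
            self.min_response_time = rounded
        self.min_response_time = min(self.min_response_time, rounded)
        self.max_response_time = max(self.max_response_time, rounded)

entry = StatsEntrySketch()
entry.log_response_time(147)
print(entry.max_response_time)  # 150 -- the 100% column can no longer fall below the 99%
```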
#331: Use rounded_response_time for min/max/total response times
@pior Looks like a fix was merged for this. Can this bug be closed?
I can't really test that currently. |
Note that all latencies are multiples of 10, and the 99% is higher than the 100%.