
Adapt buckets for latency metrics to better reflect production numbers #3366

Closed
5 tasks
romac opened this issue May 24, 2023 · 0 comments · Fixed by #3382
Labels: I: telemetry (Internal: related to Telemetry & metrics)
Milestone: v1.6

Comments

romac (Member) commented May 24, 2023

Summary of Bug

In production it looks like most of the latency measurements fall in the +Inf bucket. We should therefore adapt the histogram buckets to better reflect the latencies observed in production.

Version

Hermes 1.5.0

Steps to Reproduce

(Two screenshots attached: 2023-05-24 at 17:58:30 and 2023-05-24 at 17:59:12.)

Acceptance Criteria


For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate milestone (priority) applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned
@romac romac added the I: telemetry Internal: related to Telemetry & metrics label May 24, 2023
@romac romac added this to the v1.6 milestone May 24, 2023
@romac romac added this to Hermes May 24, 2023
@github-project-automation github-project-automation bot moved this to 🩹 Triage in Hermes May 24, 2023
@seanchen1991 seanchen1991 moved this from 🩹 Triage to 📥 Todo in Hermes May 30, 2023
@github-project-automation github-project-automation bot moved this from 📥 Todo to ✅ Done in Hermes Jun 14, 2023
Project: Hermes · Status: ✅ Done