
Allow math on label_format expressions #2955

Closed
owen-d opened this issue Nov 19, 2020 · 8 comments · Fixed by #3434
Labels
keepalive: An issue or PR that will be kept alive and never marked as stale.

Comments

@owen-d
Member

owen-d commented Nov 19, 2020

I would suggest we open up label_format stages to allow mathematical operations. For instance, say you wanted counts based on status code groups (2xx, 4xx, etc.).

I'd like to use something like this, relying on integer division to do the grouping: | json | label_format status="{{.status / 100}}"

/cc @cyriltovena @chancez WDYT?

@chancez
Contributor

chancez commented Nov 19, 2020

So an example using that might look something like this?

sum by (status_bucket) (count_over_time({} | logfmt | label_format status_bucket="{{ div .status 100 }}" [1m]))

Also, what about cases where the range of the label's values isn't fully known, for example with a duration field? Status codes have a fixed range of values (roughly 0-599), so it's reasonable to just divide by 100. With a duration we could do something similar, but what if we wanted a fixed number of buckets for a field with an unknown range of values? Or an unbounded number of fixed-size buckets, or a specific number of fixed-size buckets with overflow going into the last bucket (or a spill-over bucket)?
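For illustration, here is a rough sketch of what such bucketing could look like if label_format templates exposed math helpers. It uses Go's text/template with hypothetical div and bucket functions registered in a FuncMap; the names and signatures are illustrative, not Loki's actual template functions at the time:

```go
package main

import (
	"os"
	"text/template"
)

func main() {
	// Hypothetical helpers registered on the template; label_format did not
	// expose math functions like these when this issue was opened.
	funcs := template.FuncMap{
		// Integer division: status 404 / 100 -> bucket 4.
		"div": func(a, b int) int { return a / b },
		// Fixed-size buckets with overflow spilling into the last bucket.
		"bucket": func(v, size, last int) int {
			b := v / size
			if b > last {
				return last
			}
			return b
		},
	}

	tmpl := template.Must(template.New("labels").Funcs(funcs).Parse(
		`status_bucket={{ div .Status 100 }} latency_bucket={{ bucket .LatencyMS 100 10 }}`))

	data := struct {
		Status    int
		LatencyMS int
	}{Status: 404, LatencyMS: 2350}

	// Prints "status_bucket=4 latency_bucket=10"; 2350ms overflows into bucket 10.
	if err := tmpl.Execute(os.Stdout, data); err != nil {
		panic(err)
	}
}
```

The last argument to the hypothetical bucket helper caps the bucket index, which is one way to express the spill-over behaviour described above.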

@owen-d
Member Author

owen-d commented Nov 19, 2020

It'll probably be helpful to expose the ability to clamp values to [min, max], operate on parseable durations, etc., as you've suggested.
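A minimal sketch of the clamping and duration handling mentioned here, assuming a hypothetical clamp helper wrapping time.ParseDuration from the Go standard library (again, not an existing label_format function):

```go
package main

import (
	"fmt"
	"time"
)

// clamp restricts v to the inclusive range [lo, hi]; a hypothetical helper,
// not something label_format templates offered at the time.
func clamp(v, lo, hi float64) float64 {
	if v < lo {
		return lo
	}
	if v > hi {
		return hi
	}
	return v
}

func main() {
	// Durations extracted from a log line arrive as strings such as "1.2s".
	d, err := time.ParseDuration("1.2s")
	if err != nil {
		panic(err)
	}

	// Clamp the value (in seconds) to [0, 10] before bucketing or labeling.
	fmt.Println(clamp(d.Seconds(), 0, 10)) // prints 1.2
}
```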

@stale

stale bot commented Dec 20, 2020

This issue has been automatically marked as stale because it has not had any activity in the past 30 days. It will be closed in 7 days if no further activity occurs. Thank you for your contributions.

stale bot added the stale label on Dec 20, 2020
@chancez
Contributor

chancez commented Dec 20, 2020

Not stale. Discussion can continue after the holidays.

stale bot removed the stale label on Dec 20, 2020
@stale

stale bot commented Jan 20, 2021

This issue has been automatically marked as stale because it has not had any activity in the past 30 days. It will be closed in 7 days if no further activity occurs. Thank you for your contributions.

stale bot added the stale label on Jan 20, 2021
@chancez
Contributor

chancez commented Jan 20, 2021

Not stale

stale bot removed the stale label on Jan 20, 2021
@stale

stale bot commented Feb 21, 2021

This issue has been automatically marked as stale because it has not had any activity in the past 30 days. It will be closed in 7 days if no further activity occurs. Thank you for your contributions.

stale bot added the stale label on Feb 21, 2021
@chancez
Contributor

chancez commented Feb 22, 2021

@owen-d can we get a keep alive label?

stale bot removed the stale label on Feb 22, 2021
adityacs added the keepalive label on Mar 4, 2021