
Support Arbitrary Repository Metrics #449

Open
fkorotkov opened this issue Sep 9, 2019 · 4 comments
Description

In addition to #93, it would be nice to give users the ability to expose arbitrary metrics like code coverage, benchmark scores, the number of TODOs, etc.

Context

Cirrus could comment on PRs with a comparison of such stats against the PR's base, and could also expose badges.

Each stat should have properties like:

  • `unit` to define whether the value is a percentage or an absolute number.
  • `merge_strategy` to define how to merge stats with the same name from different tasks: either `override` or `average` (the latter might be useful for benchmarking: run a couple of tasks in parallel and take the average; see the sketch after the syntax example below).

Syntax might look like this:

```yaml
on_success:
  metric:
    name: performance
    unit: seconds
    merge_strategy: average
    value_script: cat benchmarking/performance.txt
```
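
To make the averaging behavior concrete, here is a hypothetical expansion of the proposed syntax (nothing here is implemented; the task names and the `./run-benchmark.sh` script are made up for illustration). Two parallel tasks report a metric with the same name, and `merge_strategy: average` would make the build-level value the mean of the two:

```yaml
# Hypothetical config, assuming the proposed metric keys above are adopted.
# Both tasks report a "performance" metric; with merge_strategy: average,
# the build-level value would be the mean of the two task results.
task:
  name: benchmark_a
  benchmark_script: ./run-benchmark.sh > performance.txt
  on_success:
    metric:
      name: performance
      unit: seconds
      merge_strategy: average
      value_script: cat performance.txt

task:
  name: benchmark_b
  benchmark_script: ./run-benchmark.sh > performance.txt
  on_success:
    metric:
      name: performance
      unit: seconds
      merge_strategy: average
      value_script: cat performance.txt
```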
RDIL commented Sep 9, 2019

Yes, this sounds so cool!

tobim commented Oct 1, 2019

`on_success` would be a key under `task`, right? If so, the metric name should be concatenated with the task name by default, and merging of metrics across tasks should only be done if requested.
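
One possible reading of that default, sketched in the proposed (not yet implemented) syntax; the task-scoped names shown in the comments are assumptions, not existing Cirrus behavior:

```yaml
# Hypothetical sketch: each metric is scoped to its task by default,
# so these would surface as "build/performance" and "test/performance"
# rather than one shared "performance" metric.
task:
  name: build
  on_success:
    metric:
      name: performance   # reported as "build/performance" by default
      unit: seconds
      value_script: cat perf.txt

task:
  name: test
  on_success:
    metric:
      name: performance          # reported as "test/performance" by default
      unit: seconds
      merge_strategy: average    # cross-task merging only when requested
      value_script: cat perf.txt
```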

tobim commented Oct 1, 2019

I recently found https://seriesci.com/, an app that implements something like this in an orthogonal way.

fkorotkov (Author) commented

Interesting! Will keep an eye on @seriesci! Thanks for sharing!

I was thinking about scoping metrics by build (aka SHA). That's why there is `merge_strategy`: to merge metrics of the same name produced by different tasks within a build (you can run a benchmark several times and take the average as the result).
