
feat: add server metrics #273

Merged
8 commits merged into k8sgpt-ai:main on Apr 15, 2023

Conversation

matthisholleville (Contributor)

Closes #

📑 Description

This commit adds the metrics path and the analyzer error metrics to the codebase.

The changes have been made across all analyzers and include a new metric with label values for the analyzer's name, the analyzed object's name, and its namespace.

The metric's value is set to the number of failures reported for the analyzed object.

✅ Checks

  • My pull request adheres to the code style of this project
  • My code requires changes to the documentation
  • I have updated the documentation as required
  • All the tests have passed

ℹ Additional Information

$ curl http://localhost:8080/metrics
# HELP analyzer_errors Number of errors detected by analyzer
# TYPE analyzer_errors gauge
analyzer_errors{analyzer_name="HorizontalPodAutoscaler",namespace="k8sgpt",object_name="fake-hpa"} 1
analyzer_errors{analyzer_name="Ingress",namespace="k8sgpt",object_name="fake-ingress"} 3
...
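As a rough illustration of the description above, here is a minimal sketch of how such a gauge vector can be registered and exposed with the Prometheus Go client (prometheus/client_golang). The standalone main, the label values, and the port are illustrative only, not the actual code added in this PR:

package main

import (
	"log"
	"net/http"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/promhttp"
)

// analyzerErrors mirrors the metric shown in the curl output above: a gauge
// vector labelled with the analyzer's name, the analyzed object's name, and
// its namespace.
var analyzerErrors = prometheus.NewGaugeVec(prometheus.GaugeOpts{
	Name: "analyzer_errors",
	Help: "Number of errors detected by analyzer",
}, []string{"analyzer_name", "object_name", "namespace"})

func main() {
	prometheus.MustRegister(analyzerErrors)

	// After an analyzer run, the gauge is set to the number of failures found
	// for one object (values here are made up for illustration).
	analyzerErrors.WithLabelValues("Ingress", "fake-ingress", "k8sgpt").Set(3)

	// Serve the metrics path queried by the curl command above.
	http.Handle("/metrics", promhttp.Handler())
	log.Fatal(http.ListenAndServe(":8080", nil))
}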


Signed-off-by: Matthis Holleville <matthish29@gmail.com>
matthisholleville requested review from a team as code owners on April 14, 2023, 13:17
thschue (Contributor) left a comment


Good work! I think the metric should be a Counter instead of a Gauge, but the rest looks good!

pkg/analyzer/analyzer.go (review comment resolved)
matthisholleville (Contributor, Author) commented on Apr 15, 2023

Thank you for your feedback @thschue. From my understanding, this metric can increase or decrease depending on the number of issues: the user can partially fix errors and the metric will decrease accordingly.

A counter, on the other hand, can only increase, which does not fit our case.

What do you think?

thschue (Contributor) commented on Apr 15, 2023


Sorry, I had the wrong use case in mind. The gauge is ok in this case!
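For context on the distinction discussed above: in the Prometheus Go client a Counter only offers Inc/Add and is expected to grow monotonically, while a Gauge offers Set, so its value can fall again once failures are fixed. A small illustrative sketch (metric names are made up, not code from this PR):

package main

import "github.com/prometheus/client_golang/prometheus"

func main() {
	// A Counter only goes up; it fits totals such as "requests handled".
	requestsTotal := prometheus.NewCounter(prometheus.CounterOpts{
		Name: "requests_total",
		Help: "Total number of requests handled (illustrative).",
	})
	requestsTotal.Inc()

	// A Gauge can be set to any value, so it can decrease when issues are fixed.
	currentFailures := prometheus.NewGauge(prometheus.GaugeOpts{
		Name: "analyzer_errors_example",
		Help: "Current number of failures for one object (illustrative).",
	})
	currentFailures.Set(3) // three failures detected on this run
	currentFailures.Set(1) // two were fixed, so the reported value drops
}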

thschue (Contributor) left a comment


lgtm

thschue merged commit a3becc9 into k8sgpt-ai:main on Apr 15, 2023
nunoadrego mentioned this pull request on Apr 17, 2023