
[Feature]: add llm request count on prometheus metric #786

Closed

JuHyung-Son opened this issue Dec 5, 2023 · 2 comments
@JuHyung-Son
Contributor

Checklist

  • I've searched for similar issues and couldn't find anything matching
  • I've discussed this feature request in the K8sGPT Slack and got positive feedback

Is this feature request related to a problem?

None

Problem Description

LLMs are usually paid services, so it would be helpful if users could monitor how many LLM requests k8sgpt makes.

Solution Description

Add an LLM request counter, labeled by the backend configured in k8sgpt.
Example Prometheus metric: [ deployment="k8sgpt-test-example" namespace="default" type="counter" backend="openai" ] = 0
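
A minimal sketch of what such a counter could look like with Go's prometheus/client_golang; the metric name k8sgpt_llm_request_count, the label set, and the IncLLMRequest helper are illustrative assumptions, not k8sgpt's actual code:

```go
package metrics

import "github.com/prometheus/client_golang/prometheus"

// llmRequestCount counts requests sent to the configured LLM backend,
// labeled by deployment, namespace, and backend.
// Metric and label names here are illustrative only.
var llmRequestCount = prometheus.NewCounterVec(
	prometheus.CounterOpts{
		Name: "k8sgpt_llm_request_count", // hypothetical name
		Help: "Number of requests sent to the configured LLM backend.",
	},
	[]string{"deployment", "namespace", "backend"},
)

func init() {
	// Register with the default registry so an existing /metrics
	// endpoint exposes the counter.
	prometheus.MustRegister(llmRequestCount)
}

// IncLLMRequest would be called wherever an analysis step dispatches a
// request to the AI backend.
func IncLLMRequest(deployment, namespace, backend string) {
	llmRequestCount.WithLabelValues(deployment, namespace, backend).Inc()
}
```

With labels like these, per-backend totals can be summed or alerted on directly in Prometheus.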

Benefits

Users can track the number of requests sent to the LLM backend.

Potential Drawbacks

No response

Additional Information

No response

@AlexsJones
Member

AlexsJones commented Dec 7, 2023

#789, please review.

@AlexsJones AlexsJones self-assigned this Dec 7, 2023
@AlexsJones
Member

Moved into the operator k8sgpt-ai/k8sgpt-operator#290
