🚀 Feature request: validate that the built-in LLMs are usable when the user enables them #714
Comments
That may not be necessary. We can't shut down the service just because the current request fails: the quota is uncertain and changes dynamically, and even if it's temporarily exhausted, it may be topped up later and become available again.
Yeah, there may be a misunderstanding. I mean checking and prompting the user when the switch is turned on, not shutting down the service.
To avoid misuse of the built-in AI service, I limit its quota and usually replenish it periodically. However, this update uses some new models, such as glm-4-flash, and since one-api defaults to a multiplier of 30, the quota cost is very high and the quota gets used up all at once. I have now set the credit to unlimited, so this problem will not occur again.
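A rough illustration of why the default multiplier drains the quota so quickly. This is only a sketch assuming cost scales linearly with the model multiplier, as in one-api-style gateways; the token counts are made up and the exact quota formula in one-api may differ.

```ts
// Illustration only: assumes quota cost ≈ tokens × model multiplier.
const tokensUsed = 2_000;          // tokens consumed by one request (example value)
const defaultMultiplier = 30;      // one-api's default for an unconfigured model (per the comment above)
const baselineMultiplier = 1;      // a model explicitly configured with ratio 1

const costAtDefault = tokensUsed * defaultMultiplier;   // 60,000 quota units
const costAtBaseline = tokensUsed * baselineMultiplier; //  2,000 quota units

console.log(costAtDefault / costAtBaseline); // 30x: a small quota is exhausted almost immediately
```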
But if the quota is unlimited, wouldn't that lead to very high bills?
You're right. If the currently enabled service is in an unavailable state, we should prompt the user and not allow them to enable it.
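A minimal sketch of what that check could look like when the switch is toggled on. The probe endpoint (`/api/builtin-llm/health`), its response shape, and the handler name are all hypothetical; the real codebase will use different names and may prefer a quota query or a cheap test completion instead.

```ts
// Hypothetical availability probe for the built-in LLM service.
async function checkBuiltinLLMAvailability(): Promise<boolean> {
  try {
    const res = await fetch("/api/builtin-llm/health", { method: "GET" });
    if (!res.ok) return false;
    const data: { available?: boolean } = await res.json();
    return data.available !== false;
  } catch {
    return false; // network error or malformed response: treat as unavailable
  }
}

// Called when the user flips the "built-in LLMs" switch.
// Returns whether the new state should actually be applied.
async function onToggleBuiltinLLM(enable: boolean): Promise<boolean> {
  if (!enable) return true; // disabling never needs validation
  const available = await checkBuiltinLLMAvailability();
  if (!available) {
    // Prompt the user and refuse to enable, as suggested in the thread.
    alert("The built-in LLM service is currently unavailable (e.g. quota exhausted). Please try again later.");
    return false;
  }
  return true;
}
```

The key point is that the check runs only when enabling, so a temporarily exhausted quota blocks activation with a clear message instead of failing silently on later requests.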
Please confirm the following first
Feature description
The built-in LLMs quota is used up, but I can still enable them. I think we should validate the built-in LLMs when the user enables them.
Use case
When the user enables the built-in LLMs.
Proposed implementation (optional)
No response
Are you willing to submit a PR to implement this feature?