
🚀 Feature request: validate that the built-in LLMs are usable when the user enables them #714

Open
5 tasks done
AkaShark opened this issue Nov 4, 2024 · 6 comments
Assignees
Labels
enhancement (New feature or request), fixed in next release (The issue will be closed once next release is available)

Comments

@AkaShark
Collaborator

AkaShark commented Nov 4, 2024

Please confirm the following first

  • I have carefully read the README
  • I have searched the issues page (including closed issues) and found no similar feature request
  • Easydict has been upgraded to the latest version
  • I understand and accept the above, and I understand that the maintainers' time is limited; issues that do not follow the rules may be ignored or closed directly

Feature description

The built-in LLMs' quota is used up, but I can still enable them. I think we should validate the built-in LLMs when the user enables them.

CleanShot 2024-11-04 at 23 43 52@2x

CleanShot 2024-11-04 at 23 48 50@2x

Use case

When the user enables the built-in LLMs.

Implementation plan (optional)

No response

Are you willing to submit a PR to implement this feature?

  • I am willing to submit a PR to implement this feature
@AkaShark AkaShark added the enhancement New feature or request label Nov 4, 2024
@AkaShark AkaShark self-assigned this Nov 4, 2024
@tisfeng
Owner

tisfeng commented Nov 5, 2024

That may not be necessary.

We can't shut down this service just because the current request fails: the quota is uncertain and changes dynamically, and even if it's temporarily exhausted, it may be replenished later and become available again.

@AkaShark
Collaborator Author

AkaShark commented Nov 5, 2024

> That may not be necessary.
>
> We can't shut down this service just because the current request fails: the quota is uncertain and changes dynamically, and even if it's temporarily exhausted, it may be replenished later and become available again.

Yeah, there may be some misunderstanding. I meant to check and prompt the user when the switch is turned on, not to shut down the service.

@tisfeng
Owner

tisfeng commented Nov 5, 2024

To avoid misuse of the built-in AI service, I limit its quota, which I usually replenish periodically.

However, we are using some new models for this update, such as glm-4-flash, and since one-api defaults to a multiplier of 30 for them, the quota cost is very high and the quota gets used up all at once.
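As a rough illustration of why the default multiplier matters: a one-api-style gateway scales each request's quota cost by a per-model ratio, so a multiplier of 30 burns quota 30 times faster than a multiplier of 1. The function and numbers below are hypothetical, not one-api's actual accounting (which also distinguishes prompt and completion tokens):

```python
# Hypothetical sketch of multiplier-based quota billing, as in a
# one-api-style gateway. Numbers are illustrative only.

def quota_cost(tokens: int, base_price_per_1k: float, model_multiplier: float) -> float:
    """Quota consumed by one request: token count scaled by base price and model ratio."""
    return tokens / 1000 * base_price_per_1k * model_multiplier

# Same request, same base price; only the model multiplier differs.
cost_ratio_1 = quota_cost(10_000, base_price_per_1k=0.002, model_multiplier=1)
cost_ratio_30 = quota_cost(10_000, base_price_per_1k=0.002, model_multiplier=30)
# cost_ratio_30 is exactly 30x cost_ratio_1, so a fixed quota is
# exhausted 30x sooner than expected.
```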

Now I have set the credit to unlimited. This problem will not occur again.


@AkaShark
Collaborator Author

AkaShark commented Nov 5, 2024

> To avoid misuse of the built-in AI service, I limit its quota, which I usually replenish periodically.
>
> However, we are using some new models for this update, such as glm-4-flash, and since one-api defaults to a multiplier of 30 for them, the quota cost is very high and the quota gets used up all at once.
>
> Now I have set the credit to unlimited. This problem will not occur again.

But if it's an unlimited quota, wouldn't that cause very high bills?

@tisfeng
Owner

tisfeng commented Nov 5, 2024

> That may not be necessary.
> We can't shut down this service just because the current request fails: the quota is uncertain and changes dynamically, and even if it's temporarily exhausted, it may be replenished later and become available again.

> Yeah, there may be some misunderstanding. I meant to check and prompt the user when the switch is turned on, not to shut down the service.

You're right: if the currently enabled service is in an unavailable state, we should prompt the user and not allow them to enable it.
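The agreed-upon flow could be sketched roughly as follows: when the switch is flipped on, fire a minimal probe request first, and only enable the service if the probe succeeds. All names here are illustrative, not Easydict's actual API:

```python
# Hypothetical sketch of validating the built-in LLM service on enable.
# `send_probe`, `enable`, and `notify` are placeholder callbacks, not
# real Easydict functions.

class QuotaExhaustedError(Exception):
    """Raised when the gateway reports that the quota is used up."""

def probe_builtin_llm(send_probe):
    """Run a minimal request; return (usable, message)."""
    try:
        send_probe()
        return True, "Built-in LLM is available."
    except QuotaExhaustedError:
        return False, "Quota is used up; please try again later."
    except Exception as e:  # network errors, auth failures, etc.
        return False, f"Service check failed: {e}"

def on_enable_switch(send_probe, enable, notify):
    """Enable the service only if the probe succeeds; otherwise prompt the user."""
    usable, message = probe_builtin_llm(send_probe)
    if usable:
        enable()
    else:
        notify(message)  # prompt instead of silently turning the switch on
    return usable
```

Note that, per the discussion above, a failed probe only blocks this one enable attempt; since the quota may be replenished later, the user can simply try the switch again.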

@tisfeng
Owner

tisfeng commented Nov 5, 2024

OK, I've reverted the change and set it to $100. If it's not abused, this quota should last a long time.


@AkaShark AkaShark added the fixed in next release The issue will be closed once next release is available label Nov 21, 2024
2 participants