Could a file be provided for modifying the default settings? #1196
🥰 Description of requirements

Just like nextweb, the compression threshold and the number of history messages could be set by default, so that a conversation can continue the way it does in nextweb without the API cost rising because the conversation has grown too long. This matters especially now that GPT-4 accepts much more text: ordinary users do not realize that, because the message count is unlimited by default, this behavior will drive the API cost up.

Also, I still have not understood what the "remaining tokens" indicator shown during a chat is for.

🧐 Solution

No

📝 Supplementary information

No response
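A minimal sketch of the requested behavior, in TypeScript, assuming a hypothetical helper (`trimHistory` and `HISTORY_LIMIT` are illustrative names, not LobeChat code): capping the context at the most recent N messages bounds the tokens, and therefore the cost, of every request no matter how long the conversation grows.

```typescript
// Illustrative only: cap the context sent with each request to the last
// N messages so the token count, and therefore the cost, stays bounded.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

const HISTORY_LIMIT = 10; // the default count suggested later in this thread

function trimHistory(messages: ChatMessage[], limit: number = HISTORY_LIMIT): ChatMessage[] {
  // Keep any system prompt, plus only the most recent `limit` turns.
  const system = messages.filter((m) => m.role === 'system');
  const rest = messages.filter((m) => m.role !== 'system');
  return [...system, ...rest.slice(-limit)];
}
```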
👀 @cokice Thank you for raising an issue. We will look into the matter and get back to you as soon as possible.
This could become LobeChat's default setting, for example a default of 10 context messages. Setting default values will have to wait for #913 to be implemented; the previous PR #1180 has already paved the way for it, so it should be relatively easy to do.

As for the remaining-token indicator: with the 4k-context gpt-3.5 you used before, the context is very easy to use up, so the indicator serves as a reminder. In the new version we will change it to show current usage instead.
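A minimal sketch of what such a default might look like, assuming a hypothetical settings shape (the field names `enableHistoryCount`, `historyCount`, and `enableCompressHistory` are illustrative assumptions, not LobeChat's confirmed schema; the real defaults are expected to land via #913):

```typescript
// Hypothetical default chat config; the field names below are assumptions
// for illustration and not LobeChat's confirmed settings schema.
interface DefaultChatConfig {
  enableHistoryCount: boolean;    // whether to cap the context at all
  historyCount: number;           // how many recent messages to keep
  enableCompressHistory: boolean; // summarize older turns instead of dropping them
}

const defaultChatConfig: DefaultChatConfig = {
  enableHistoryCount: true,
  historyCount: 10, // the "default of 10 context messages" suggested above
  enableCompressHistory: true,
};
```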
✅ @cokice This issue is closed. If you have any questions, feel free to comment and reply.
🎉 This issue has been resolved in version 0.128.0 🎉

The release is available on:

Your semantic-release bot 📦🚀