[Feature Request] Support streamWithThink for other models #6128
Comments
![Image](https://github.com/user-attachments/assets/e037ab30-5f53-410d-b74b-a537ca019612)
related: #6141
Fixed the format in #6186.
Can this be made into a general solution? It currently seems to only work for SiliconFlow.
It's only appropriate to generalize fields that are part of the original OpenAI API spec to custom endpoints. This field is DeepSeek-specific, so it's hard to change everything in one go (for example, some models may instead emit `<think>` tags in their output). The current code is already general in that sense: each model implements its own parseSSE function.
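To illustrate the per-model design the comment describes, here is a minimal sketch (hypothetical interface and provider objects, not NextChat's actual code): each provider supplies its own parseSSE, so DeepSeek can surface its provider-specific `reasoning_content` field while plain-OpenAI endpoints ignore it.

```typescript
// Hypothetical shape of per-model SSE handling. Each provider implements
// parseSSE to map one streamed delta payload onto display content and
// reasoning text.
interface SSEDelta {
  content?: string;
  reasoning_content?: string; // DeepSeek-specific field
}

interface ProviderApi {
  parseSSE(delta: SSEDelta): { content: string; reasoning: string };
}

const deepseekApi: ProviderApi = {
  parseSSE: (d) => ({
    content: d.content ?? "",
    reasoning: d.reasoning_content ?? "", // read the DeepSeek field
  }),
};

const openaiApi: ProviderApi = {
  parseSSE: (d) => ({
    content: d.content ?? "",
    reasoning: "", // the plain OpenAI spec has no reasoning field
  }),
};
```

This keeps provider quirks isolated: adding a new reasoning-capable provider means adding one parseSSE, not touching a shared code path.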
🥰 Feature Description

Currently, only the `deepseek-reasoner` model supports `streamWithThink`. However, when we use a third-party DeepSeek API through the OpenAI interface, `streamWithThink` is disabled.

🧐 Proposed Solution

I have a naive idea but I'm not sure if it'll work: detect `<think>` in the stream for all models and turn on "think" mode automatically.

📝 Additional Information

No response
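The proposed detection could be sketched roughly as follows (a hypothetical stateful parser, not the project's actual implementation): scan each streamed chunk for `<think>`/`</think>` and route text inside the tags to a reasoning buffer. Note this simple version does not handle a tag split across two chunks.

```typescript
// Hypothetical sketch: split streamed text into "reasoning" (inside
// <think>...</think>) and ordinary "content", keeping state across chunks.
type Parsed = { reasoning: string; content: string };

function makeThinkParser() {
  let inThink = false; // true while we are inside a <think> block
  return function parse(chunk: string): Parsed {
    let reasoning = "";
    let content = "";
    let rest = chunk;
    while (rest.length > 0) {
      if (inThink) {
        const end = rest.indexOf("</think>");
        if (end === -1) {
          reasoning += rest; // whole chunk is still reasoning
          rest = "";
        } else {
          reasoning += rest.slice(0, end);
          rest = rest.slice(end + "</think>".length);
          inThink = false;
        }
      } else {
        const start = rest.indexOf("<think>");
        if (start === -1) {
          content += rest; // whole chunk is ordinary content
          rest = "";
        } else {
          content += rest.slice(0, start);
          rest = rest.slice(start + "<think>".length);
          inThink = true; // "think" mode turns on automatically
        }
      }
    }
    return { reasoning, content };
  };
}
```

A real version would also need to buffer a partial tag at a chunk boundary (e.g. a chunk ending in `<thi`) before deciding how to classify it.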