
[Feature]: Add volcengine/deepseek-r1 #8276

Open
issac2e opened this issue Feb 5, 2025 · 0 comments
Labels
enhancement New feature or request

Comments


issac2e commented Feb 5, 2025

The Feature

Add support for Deepseek-R1 deployed on volcengine. Volcengine already serves the R1 model, but the reasoning_content field is missing from the R1 response returned through LiteLLM. Can it be added?

https://docs.litellm.ai/docs/providers/volcano
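
For context, this is roughly how the model is called today through LiteLLM's volcengine provider (see the docs linked above). A minimal sketch only: the API-key environment variable and the endpoint/model ID are placeholders and may differ from the actual deployment.

```python
import os
import litellm

# Placeholder credential; the exact environment variable expected by the
# volcengine provider is described in the linked LiteLLM docs.
os.environ["VOLCENGINE_API_KEY"] = "your-api-key"

response = litellm.completion(
    model="volcengine/deepseek-r1-250120",  # endpoint/model ID is an example
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)

# Today only `content` comes back on the message; the request in this issue
# is to also surface volcengine's `reasoning_content` field.
print(response.choices[0].message.content)
```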

Motivation, pitch

I am accessing the Deepseek-R1 model through volcengine and I do get a response, but the "reasoning_content" field is missing from the result returned by LiteLLM. I hope the provider can be adapted so the field is passed through.

Expected response body, including the reasoning content:
{
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "logprobs": null,
      "message": {
        "content": ".....",
        "reasoning_content": ".....",
        "role": "assistant"
      }
    }
  ],
  "created": 1738752116,
  "id": "0217387520712960fa2ee87f2976183a0d77cb9bf7fa6a37fcac1",
  "model": "deepseek-r1-250120",
  "object": "chat.completion",
  "usage": {
    "completion_tokens": 599,
    "prompt_tokens": 30,
    "total_tokens": 629,
    "prompt_tokens_details": {
      "cached_tokens": 0
    },
    "completion_tokens_details": {
      "reasoning_tokens": 290
    }
  }
}
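
Continuing the sketch above, consuming the field once it is mapped might look like the following. The attribute name mirrors volcengine's reasoning_content key; exactly how LiteLLM would expose it on its message object is an assumption here.

```python
# Hypothetical access once LiteLLM passes the field through; `response` is
# the completion returned by the sketch under "The Feature".
message = response.choices[0].message
print("answer:", message.content)
print("reasoning:", getattr(message, "reasoning_content", None))

# The reasoning token accounting in the expected body would likewise appear
# under usage, e.g. completion_tokens_details.reasoning_tokens == 290 above.
```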

Are you a ML Ops Team?

No

Twitter / LinkedIn details

No response

issac2e added the enhancement (New feature or request) label Feb 5, 2025