[Bug]: Mistral Codestral doesn't work #8279
Comments
Looks like a similar issue: #8058
Assuming it's the same issue, I wrote the following in the other issue:
Is that what you're running into?
I'm 100% sure the key is correct; it was copied from https://console.mistral.ai/codestral. It works directly through Postman, and in Continue in VS Code/PyCharm.
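For reference, a direct sanity check of the key against Mistral's Codestral endpoint looks roughly like this (a minimal sketch: the endpoint and model name follow Mistral's published Codestral docs; the key variable is a placeholder):

```shell
# Direct check of the key against Mistral's dedicated Codestral endpoint
# (endpoint and model per Mistral's docs; CODESTRAL_API_KEY is a placeholder).
curl https://codestral.mistral.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $CODESTRAL_API_KEY" \
  -d '{
        "model": "codestral-latest",
        "messages": [{"role": "user", "content": "Write fizzbuzz in Python"}]
      }'
```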
Similar problem for o1 and o3-mini.
gpt-4o, gpt-4o-mini, o1-mini, and o1-preview work fine.
@ohmyboroda OpenAI is saying you don't have model access
Same here.
Can you please share the raw request made by litellm to the provider?
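The specific flag pointed at here isn't preserved in this thread; assuming it is LiteLLM's documented `--detailed_debug` option, a sketch:

```shell
# Start the proxy with verbose logging so raw provider requests are printed
# (--detailed_debug is LiteLLM's documented debug flag; the path is a placeholder).
litellm --config /path/to/config.yaml --detailed_debug
```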
Ok, makes sense. I will check it later. Thanks.
Ok, I will do that soon.
@krrishdholakia apologies if this is a basic question, but where is the config.yaml file located for that command? Does LiteLLM generate this file automatically, or do I need to create it manually? If so, what is the recommended location? Thanks in advance for your help!
It's user-defined. I'm confused, how did you run the proxy initially?
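As a sketch of what such a user-defined file can look like for Codestral (the model alias and env-var name are assumptions; the structure follows LiteLLM's proxy config docs):

```yaml
# Hypothetical minimal config.yaml with a Codestral entry
# (alias and env-var name are assumptions; structure per LiteLLM's proxy docs).
model_list:
  - model_name: codestral-latest
    litellm_params:
      model: codestral/codestral-latest
      api_key: os.environ/CODESTRAL_API_KEY
```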
Self-hosted Docker Compose.
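A hypothetical Docker Compose service wiring that config in might look like this (image tag, port, and flags follow LiteLLM's documented Docker setup; treat paths and names as placeholders):

```yaml
# Hypothetical docker-compose.yml service mounting a user-created config.yaml
# (image, port, and flags per LiteLLM's documented Docker setup; paths are placeholders).
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    volumes:
      - ./config.yaml:/app/config.yaml
    command: ["--config", "/app/config.yaml", "--detailed_debug"]
    ports:
      - "4000:4000"
    environment:
      - CODESTRAL_API_KEY=${CODESTRAL_API_KEY}
```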
But how did you add the codestral model?
Accidentally completed it.
What happened?
When sending a POST request to /v1/completions, an error occurs.
How to reproduce:
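The original repro steps weren't preserved here; a minimal reconstruction of the failing call, assuming the proxy listens on its default port with the model alias from the config above:

```shell
# Hypothetical reconstruction of the failing request through the LiteLLM proxy
# (host, port, master key, and model alias are assumptions, not the reporter's exact steps).
curl http://localhost:4000/v1/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
        "model": "codestral-latest",
        "prompt": "def fibonacci(n):",
        "max_tokens": 64
      }'
```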
Relevant log output
Are you an ML Ops Team?
No
What LiteLLM version are you on?
v1.60.2
Twitter / LinkedIn details
No response