
[Bug]: drop_params works on individual models but not on global litellm_settings #7947

Closed
colindonovan-8451 opened this issue Jan 23, 2025 · 7 comments

Comments

@colindonovan-8451

What happened?

We have run into issues when using the OpenAI library to query non-OpenAI embedding models through LiteLLM. We previously worked around this by adding drop_params=true to every affected embedding model, but recently we tried to move that setting to the global litellm_settings block outside of our model_list. As soon as we moved the setting up a level, the original error returned.

Relevant log output

"litellm.UnsupportedParamsError: vertex_ai does not support parameters: {'encoding_format': 'base64'}, for model=text-embedding-005. To drop these, set `litellm.drop_params=True` or for proxy:\n\n`litellm_settings:\n drop_params: true`\

Are you a ML Ops Team?

No

What LiteLLM version are you on?

v1.55.9

Twitter / LinkedIn details

No response

@colindonovan-8451 colindonovan-8451 added the bug Something isn't working label Jan 23, 2025
@krrishdholakia krrishdholakia self-assigned this Jan 23, 2025
@krrishdholakia
Contributor

Hey @colindonovan-8451, just ran this and it works fine for me.

How did you set it in your config?

@ishaan-jaff
Contributor

Does this still occur, @colindonovan-8451?

@ishaan-jaff
Contributor

Can you share how you set this in your config?

@colindonovan-8451
Author

Sorry for the delay - missed the initial response.

We were seeing this behavior on 1.55.9 when we set drop_params: true under the litellm_settings block at the top level of our config YAML. We ended up adding drop_params: true to each individual model to work around the error, as sketched below.

[screenshot of config attached]
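To make the two placements concrete, a sketch of each (the model entry is illustrative, based on the vertex_ai/text-embedding-005 model from this thread, not the reporter's full config):

# Per-model setting: the workaround that worked.
model_list:
  - model_name: text-embedding-005
    litellm_params:
      model: vertex_ai/text-embedding-005
      drop_params: true

# Global setting: the placement that reportedly did not take effect.
litellm_settings:
  drop_params: true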

@ishaan-jaff
Contributor

Additional notes:

  • setting drop_params globally did not work for vertex text embeddings
  • setting it per-model did work
  • the user is calling through the OpenAI Python SDK

@krrishdholakia
Contributor

Closing as unable to repro.

@krrishdholakia krrishdholakia closed this as not planned (can't repro) Feb 11, 2025
@krrishdholakia
Contributor

curl used to test:

curl -L -X POST 'http://0.0.0.0:4000/embeddings' \
-H 'Authorization: Bearer sk-1234' \
-H 'Content-Type: application/json' \
-d '{"input": ["hello world"], "model": "text-embedding-005", "encoding_format": "base64"}'

config.yaml

model_list:
  - model_name: text-embedding-005
    litellm_params:
      model: vertex_ai/text-embedding-005

litellm_settings:
  cache: true
  drop_params: true
