[Bug]: When using LiteLLM Proxy with tool calling, Autogen and AWS Bedrock Claude, Bedrock errors when content fields are empty #4820
Comments
Hey @seam-ctooley, what version of autogen is this? I can't seem to run your script.
Also, can you run your proxy with detailed debugging enabled?
I'm on the latest Autogen version; here is a repo that reproduces the issue I'm seeing. I've got a detailed debug log, but it seems to contain AWS creds, so I'll share it tomorrow once my session expires. If we could share it over Discord as well, that would be greatly appreciated. I am "christiant_47581" on the LiteLLM server.
stderr.txt
Same issue here with the latest LiteLLM running locally, Autogen, and Claude 3 Haiku.
Same here.
This usually occurs when you install "autogen" instead of "pyautogen".
I've been able to get around the issues mentioned here by using Autogen directly with a custom client: https://gist.github.com/seam-ctooley/d22f8319f313bc160388ae5949cc20b8. So I imagine the issue lies in the translation layer to Bedrock: specific format requirements for tool calling that aren't being met.
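The workaround above suggests the translation layer is forwarding assistant messages whose `content` field is an empty string, which Bedrock's Claude models reject. A minimal sketch of the kind of message sanitizer a custom client could apply before the request is sent (the function name and the single-space placeholder are assumptions for illustration, not LiteLLM's actual fix):

```python
def sanitize_messages(messages):
    """Replace empty or missing `content` fields with a single-space
    placeholder, since Bedrock Claude reportedly rejects empty content.
    Hypothetical workaround sketch, not LiteLLM's implementation."""
    sanitized = []
    for msg in messages:
        msg = dict(msg)  # copy so the caller's messages are untouched
        if not msg.get("content"):
            msg["content"] = " "
        sanitized.append(msg)
    return sanitized

# An assistant tool-call turn typically carries an empty content string:
fixed = sanitize_messages([{"role": "assistant", "content": "",
                            "tool_calls": [{"id": "call_1"}]}])
print(repr(fixed[0]["content"]))
```

A proper fix would live in the proxy's Bedrock translation layer rather than in every client, but this is the shape of the patch.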
Same issue here.
Hey @krrishdholakia
Can you run the proxy with detailed debug enabled? If you can share that plus the latest stacktrace, that would be helpful.
Same issue here.
@krrishdholakia I can try, but I'm not sure exactly which notebook this is. I'm going through the LangGraph Academy online course, replacing all the OpenAI LLMs with LiteLLM. Or trying to :)
Is the issue still occurring? Can we get a better way to repro it, ideally with a litellm.completion request we can run on our side? @yaronr @csmizzle @seam-ctooley
This issue is kinda intermittent for me; I'm not sure what triggers it.
What happened?
Setup:
Autogen Agent:
LiteLLM Proxy Config
Minimal Reproducible Autogen setup:
Relevant log output
Twitter / LinkedIn details
No response