
[Bug]: When using LiteLLM Proxy with tool calling, Autogen and AWS Bedrock Claude, Bedrock errors when content fields are empty #4820

Open
seam-ctooley opened this issue Jul 22, 2024 · 16 comments
Labels
bedrock · bug (Something isn't working) · feb 2025 · help wanted (Extra attention is needed)

Comments

@seam-ctooley

What happened?

Setup:

  • LiteLLM Proxy configured to hit AWS Bedrock Claude 3.5 sonnet
  • Autogen Agent configured to use the LiteLLM proxy with tool calling

Autogen Agent:

CLAUDE_CONFIG = {
    "config_list": [
        {
            "model": "anthropic.claude-3-5-sonnet-20240620-v1:0",  # Loaded with LiteLLM command
            "api_key": "NotRequired",  # Not needed
            "base_url": "http://localhost:4000/",  # Your LiteLLM URL
        }
    ],
    "cache_seed": None,
}

LiteLLM Proxy Config

model_list:
  - model_name: anthropic.claude-3-5-sonnet-20240620-v1:0
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
      aws_region_name: us-east-1

litellm_settings:
  drop_params: True

Minimal Reproducible Autogen setup:

import autogen
from typing import Literal, Annotated

chatbot = autogen.AssistantAgent(
    name="chatbot",
    system_message="For currency exchange tasks, only use the functions you have been provided with. Reply TERMINATE when the task is done.",
    llm_config={
        "config_list": [
            {
                "model": "anthropic.claude-3-5-sonnet-20240620-v1:0",  # Loaded with LiteLLM command
                "api_key": "NotRequired",  # Not needed
                "base_url": "http://localhost:4000/",  # Your LiteLLM URL
            }
        ],
        "cache_seed": None,
    },
)

# create a UserProxyAgent instance named "user_proxy"
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
)

CurrencySymbol = Literal["USD", "EUR"]


def exchange_rate(base_currency: CurrencySymbol, quote_currency: CurrencySymbol) -> float:
    if base_currency == quote_currency:
        return 1.0
    elif base_currency == "USD" and quote_currency == "EUR":
        return 1 / 1.1
    elif base_currency == "EUR" and quote_currency == "USD":
        return 1.1
    else:
        raise ValueError(f"Unknown currencies {base_currency}, {quote_currency}")


@user_proxy.register_for_execution()
@chatbot.register_for_llm(description="Currency exchange calculator.")
def currency_calculator(
    base_amount: Annotated[float, "Amount of currency in base_currency"],
    base_currency: Annotated[CurrencySymbol, "Base currency"] = "USD",
    quote_currency: Annotated[CurrencySymbol, "Quote currency"] = "EUR",
) -> str:
    quote_amount = exchange_rate(base_currency, quote_currency) * base_amount
    return f"{quote_amount} {quote_currency}"

res = user_proxy.initiate_chat(
    chatbot, message="How much is 123.45 USD in EUR?"
)

Relevant log output

09:25:12 - LiteLLM Proxy:DEBUG: proxy_server.py:3004 - An error occurred: litellm.BadRequestError: BedrockException - {"message":"The text field in the ContentBlock object at messages.2.content.0 is blank. Add text to the text field, and try again."} LiteLLM Retried: 1 times, LiteLLM Max Retries: 2 None
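The error means Bedrock's API rejected a text ContentBlock whose text field is empty; Autogen appears to emit such blank content alongside tool-call turns. As a client-side workaround sketch (the helper name `strip_blank_content` is hypothetical, not part of LiteLLM or Autogen), blank text blocks could be filtered out before the request is sent:

```python
def strip_blank_content(messages):
    """Drop text blocks whose text is empty/whitespace, since Bedrock rejects them.

    If a message's content would become empty, substitute a single space so the
    message itself still passes validation. Hypothetical client-side workaround,
    not a LiteLLM API.
    """
    cleaned = []
    for msg in messages:
        content = msg.get("content")
        if isinstance(content, list):
            # Keep every block except text blocks that are empty or whitespace-only.
            blocks = [
                b for b in content
                if not (b.get("type") == "text" and not b.get("text", "").strip())
            ]
            msg = {**msg, "content": blocks or [{"type": "text", "text": " "}]}
        elif isinstance(content, str) and not content.strip():
            msg = {**msg, "content": " "}
        cleaned.append(msg)
    return cleaned
```

Under this assumption, running the cleaned messages through the proxy (or `litellm.completion`) would avoid the blank-field rejection, at the cost of injecting a placeholder space where Autogen sent nothing.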

Twitter / LinkedIn details

No response

seam-ctooley added the bug label Jul 22, 2024
@krrishdholakia
Contributor

Hey @seam-ctooley, what version of autogen is this? I can't seem to run your script.
[Screenshot attached: 2024-07-22 at 2:33:04 PM]

@krrishdholakia
Contributor

Also, can you run your proxy with --detailed_debug? It should print the raw request being made, which should help with the repro.

@seam-ctooley
Author

I'm on the latest Autogen version. Here is a repo that reproduces the issue I'm seeing:
https://github.com/seam-ctooley/litellm-bedrock-bug-repro

I've got a detailed debug log, but it seems to contain AWS creds, so I'll share it tomorrow once my session expires. If we could share it over Discord as well, that would be greatly appreciated; I am "christiant_47581" on the LiteLLM Discord server.

@seam-ctooley
Author

stderr.txt
Here is the full log file @krrishdholakia

@dbpprt

dbpprt commented Jul 26, 2024

Same issue here with latest LiteLLM running locally, Autogen and Claude 3 Haiku.

@haandol

haandol commented Jul 30, 2024

Same here.

@astroalek

> Hey @seam-ctooley what version of autogen is this? i can't seem to run your script [Screenshot attached]

This usually occurs when you install "autogen" instead of "pyautogen".

@seam-ctooley
Author

I've been able to get around the issues mentioned here by using Autogen directly with a custom client: https://gist.github.com/seam-ctooley/d22f8319f313bc160388ae5949cc20b8

So I imagine the issue lies in the translation layer to Bedrock: specific format requirements for tool calling that aren't being met.

@yaronr

yaronr commented Jan 1, 2025

Same issue here.

@krrishdholakia
Contributor

krrishdholakia commented Jan 1, 2025

Hey @yaronr what version of litellm are you seeing this error on?

The bedrock content field error was recently fixed (v1.55.4+) - #7169

@yaronr

yaronr commented Jan 2, 2025

Hey @krrishdholakia
It was the latest Docker image on the day I posted the comment.

@krrishdholakia
Contributor

Can you run the proxy with --detailed_debug? It should emit the request received by the proxy.

If you can share that plus the latest stacktrace, that would be helpful.

@csmizzle

Same issue here

@yaronr

yaronr commented Jan 29, 2025

@krrishdholakia I can try, but I'm not sure exactly which notebook this was. I'm going through the LangGraph Academy online course, replacing all the OpenAI LLMs with LiteLLM. Or trying to :)
This could be a nice QA test for you, by the way.
I also posted another issue on Discord regarding tool_choice="any" (which I think is an easy fix).

@ishaan-jaff
Contributor

Is the issue still occurring, and can we get a better way to repro it? Ideally with a litellm.completion request we can run on our side. @yaronr @csmizzle @seam-ctooley
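For reference, a minimal sketch of the message shape Autogen appears to send after a tool call, which could be passed to litellm.completion to attempt a repro. The tool-call id, argument values, and tool result below are illustrative assumptions; the key detail, matching the error's messages.2.content.0 index, is the assistant turn at index 2 carrying an empty content string alongside tool_calls:

```python
# Hedged sketch of the suspected failing payload, not a confirmed repro.
# messages[2] has blank content plus tool_calls; the Bedrock translation
# presumably turns that blank string into an empty text ContentBlock.
messages = [
    {"role": "system", "content": "For currency exchange tasks, only use the functions provided."},
    {"role": "user", "content": "How much is 123.45 USD in EUR?"},
    {
        "role": "assistant",
        "content": "",  # <- blank; suspected trigger of the BedrockException
        "tool_calls": [{
            "id": "call_1",  # illustrative id
            "type": "function",
            "function": {
                "name": "currency_calculator",
                "arguments": '{"base_amount": 123.45, "base_currency": "USD", "quote_currency": "EUR"}',
            },
        }],
    },
    {"role": "tool", "tool_call_id": "call_1", "content": "112.22727272727272 EUR"},
]

# To run against Bedrock (requires AWS credentials and the tool schema):
# import litellm
# resp = litellm.completion(
#     model="bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0",
#     messages=messages,
#     tools=[...],  # same currency_calculator schema as in the repro script
# )
```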

ishaan-jaff added the bedrock, feb 2025, and help wanted labels Feb 7, 2025
@tuananh

tuananh commented Feb 12, 2025

This issue is kind of intermittent for me; I'm not sure what triggers it.
