Litellm dev 01 22 2025 p1 (#7933)
* docs(docker_quick_start.md): add more troubleshooting guides

* test(test_fallbacks.py): add e2e test for proxy with fallbacks + custom fallback message

* test(test_bedrock_completion.py): skip test now that bedrock supports this behaviour

* test(test_fireworks_ai_translation.py): mock fireworks ai test
krrishdholakia authored Jan 23, 2025
1 parent 760ba4d commit e3bacf7
Showing 3 changed files with 103 additions and 0 deletions.
1 change: 1 addition & 0 deletions docs/my-website/docs/proxy/deploy.md
@@ -1048,3 +1048,4 @@ export DATABASE_SCHEMA="schema-name" # skip to use the default "public" schema
```bash
litellm --config /path/to/config.yaml --iam_token_db_auth
```
52 changes: 52 additions & 0 deletions docs/my-website/docs/proxy/docker_quick_start.md
@@ -382,6 +382,56 @@ litellm_settings:
ssl_verify: false # 👈 KEY CHANGE
```


### (DB) All connection attempts failed


If you see:

```
httpx.ConnectError: All connection attempts failed
ERROR: Application startup failed. Exiting.
3:21:43 - LiteLLM Proxy:ERROR: utils.py:2207 - Error getting LiteLLM_SpendLogs row count: All connection attempts failed
```

This might be a DB permission issue.

1. Check whether your DB user can create databases

Try creating a new database:

```sql
CREATE DATABASE litellm;
```

If you get:

```
ERROR: permission denied to create database
```

your DB user does not have permission to create databases.

2. Grant permissions to your DB user

It should look something like this. Connect as a superuser:

```
psql -U postgres
```

Then create the database:

```sql
CREATE DATABASE litellm;
```

On CloudSQL, grant your DB user access with:

```sql
GRANT ALL PRIVILEGES ON DATABASE litellm TO your_username;
```

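Another frequent source of the connection error above is a malformed `DATABASE_URL`. As a quick sanity check, you can split the URL into its parts before starting the proxy. This is a minimal sketch using only the standard library; the connection string below is a placeholder, not a value from this repo:

```python
from urllib.parse import urlsplit


def describe_db_url(url: str) -> dict:
    """Split a DATABASE_URL into its parts for a quick sanity check."""
    parts = urlsplit(url)
    return {
        "user": parts.username,
        "host": parts.hostname,
        "port": parts.port,
        "database": parts.path.lstrip("/"),
    }


# Placeholder connection string -- substitute your own.
print(describe_db_url("postgresql://llmproxy:dbpassword@db:5432/litellm"))
# → {'user': 'llmproxy', 'host': 'db', 'port': 5432, 'database': 'litellm'}
```

If the printed host, port, or database name is not what you expect, fix the URL before debugging DB permissions.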
**What is `litellm_settings`?**
LiteLLM Proxy uses the [LiteLLM Python SDK](https://docs.litellm.ai/docs/routing) for handling LLM API calls.
@@ -398,3 +448,5 @@ LiteLLM Proxy uses the [LiteLLM Python SDK](https://docs.litellm.ai/docs/routing)
[![Chat on WhatsApp](https://img.shields.io/static/v1?label=Chat%20on&message=WhatsApp&color=success&logo=WhatsApp&style=flat-square)](https://wa.link/huol9n) [![Chat on Discord](https://img.shields.io/static/v1?label=Chat%20on&message=Discord&color=blue&logo=Discord&style=flat-square)](https://discord.gg/wuPM9dRgDw)
50 changes: 50 additions & 0 deletions tests/test_fallbacks.py
@@ -111,3 +111,53 @@ async def test_chat_completion_client_fallbacks(has_access):
except Exception as e:
if has_access:
pytest.fail("Expected this to work: {}".format(str(e)))


@pytest.mark.parametrize("has_access", [True, False])
@pytest.mark.asyncio
async def test_chat_completion_client_fallbacks_with_custom_message(has_access):
"""
make chat completion call with prompt > context window. expect it to work with fallback
"""

async with aiohttp.ClientSession() as session:
models = ["gpt-3.5-turbo"]

if has_access:
models.append("gpt-instruct")

## CREATE KEY WITH MODELS
generated_key = await generate_key(session=session, i=0, models=models)
calling_key = generated_key["key"]
model = "gpt-3.5-turbo"
messages = [
{"role": "user", "content": "Who was Alexander?"},
]

## CALL PROXY
try:
await chat_completion(
session=session,
key=calling_key,
model=model,
messages=messages,
mock_testing_fallbacks=True,
fallbacks=[
{
"model": "gpt-instruct",
"messages": [
{
"role": "assistant",
"content": "This is a custom message",
}
],
}
],
)
if not has_access:
pytest.fail(
"Expected this to fail, submitted fallback model that key did not have access to"
)
except Exception as e:
if has_access:
pytest.fail("Expected this to work: {}".format(str(e)))
