From 9418b179c2511cbdd42b4bc65e6e27d19eed1271 Mon Sep 17 00:00:00 2001
From: r48Bit <81687400+r4881t@users.noreply.github.com>
Date: Mon, 6 May 2024 21:48:01 +0530
Subject: [PATCH] Update to correct pip install for litellm (#2602)

The doc mentions `pip install litellm[proxy]`, which fails in shells
such as zsh that treat square brackets as glob patterns. The correct
command is `pip install 'litellm[proxy]'`.
---
 website/docs/topics/non-openai-models/local-litellm-ollama.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/topics/non-openai-models/local-litellm-ollama.md b/website/docs/topics/non-openai-models/local-litellm-ollama.md
index 98b326acdf4..e9c4b6ba345 100644
--- a/website/docs/topics/non-openai-models/local-litellm-ollama.md
+++ b/website/docs/topics/non-openai-models/local-litellm-ollama.md
@@ -18,7 +18,7 @@ Note: We recommend using a virtual environment for your stack, see [this article
 Install LiteLLM with the proxy server functionality:
 
 ```bash
-pip install litellm[proxy]
+pip install 'litellm[proxy]'
 ```
 
 Note: If using Windows, run LiteLLM and Ollama within a [WSL2](https://learn.microsoft.com/en-us/windows/wsl/install).
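
For context, a quick sketch of the failure mode this patch addresses. The zsh behavior shown is its standard glob-expansion error; bash, by contrast, typically passes the unquoted form through unless a file in the working directory happens to match the pattern:

```bash
# In zsh, unquoted square brackets are glob patterns; with no matching
# file, the command aborts before pip ever runs:
#   $ pip install litellm[proxy]
#   zsh: no matches found: litellm[proxy]

# Quoting (or escaping) passes the extras specifier through literally:
pip install 'litellm[proxy]'
pip install litellm\[proxy\]   # equivalent escaped form
```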