
[feature request] LiteLLM support #43

Open
tobwen opened this issue Aug 27, 2024 · 2 comments
Labels: enhancement, feature request

Comments

tobwen commented Aug 27, 2024

feature request

Would it be possible to add LiteLLM support?

benefits

  • more than 100 LLMs would be accessible
  • retry/fallback logic across multiple deployments (see the Router sketch after this list)
  • track spend & set budgets
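
To make the retry/fallback point concrete, here's a minimal sketch using LiteLLM's Python Router. The two deployments, the fallback mapping, and the model names are illustrative assumptions, not anything tied to this project; LiteLLM picks up the usual provider keys (OPENAI_API_KEY, ANTHROPIC_API_KEY) from the environment.

```python
# Minimal sketch of LiteLLM's Router with retries and fallbacks.
# ASSUMPTIONS: the deployments and fallback mapping are illustrative
# only; provider API keys come from the environment by default.
from litellm import Router

router = Router(
    model_list=[
        {"model_name": "gpt-4o",
         "litellm_params": {"model": "openai/gpt-4o"}},
        {"model_name": "claude",
         "litellm_params": {"model": "anthropic/claude-3-5-sonnet-20240620"}},
    ],
    num_retries=2,                       # retry transient failures
    fallbacks=[{"gpt-4o": ["claude"]}],  # fall back to Claude if gpt-4o fails
)

response = router.completion(
    model="gpt-4o",
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```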

advantages

  • LiteLLM is self-hosted and actively developed
  • huge advantage: LiteLLM offers an OpenAI-compatible API, so few changes to the code base would be needed (see the sketch after this list)
  • others would take care of developing the API interface, so you could concentrate fully on the UI
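
To illustrate the compatibility claim: a minimal sketch, assuming a LiteLLM proxy is already running at http://localhost:4000. The URL, key, and model alias below are placeholders, not anything this project ships.

```python
# The stock OpenAI SDK can talk to a LiteLLM proxy just by changing
# base_url. URL, key, and model alias are placeholders (assumptions).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # assumed local LiteLLM proxy
    api_key="sk-anything",             # proxy virtual key (placeholder)
)

response = client.chat.completions.create(
    model="gpt-4o",  # whatever alias the proxy routes to a real deployment
    messages=[{"role": "user", "content": "Hello via LiteLLM"}],
)
print(response.choices[0].message.content)
```
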
harshitlakhani added the enhancement label on Sep 3, 2024
deep93333 (Collaborator) commented:

@tobwen Soon we'll add support for a custom proxy server endpoint, through which you'll be able to use LiteLLM. A rough sketch of what that could look like is below.
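
Not the actual implementation, just a hypothetical sketch of how a configurable endpoint could work on the app side; the environment variable names here are illustrative, not settings this project defines.

```python
# Hypothetical sketch: if the app builds its OpenAI client from an
# overridable base URL, pointing it at a LiteLLM proxy needs no further
# code changes. CUSTOM_PROXY_URL is an illustrative name (assumption).
import os
from openai import OpenAI

def make_client() -> OpenAI:
    # Use the official API unless a custom (e.g. LiteLLM) endpoint is set.
    base_url = os.environ.get("CUSTOM_PROXY_URL", "https://api.openai.com/v1")
    return OpenAI(base_url=base_url, api_key=os.environ["OPENAI_API_KEY"])

client = make_client()
```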

deep93333 assigned deep93333 and tobwen and unassigned deep93333 on Sep 9, 2024
ishaan-jaff commented:

Hi @tobwen, thanks for using LiteLLM. Any chance we can hop on a call to learn how we can improve LiteLLM Proxy for you?

We're planning our roadmap and I'd love to get your feedback.
