✨ LiteLLM Feb 2025 Roadmap + How to Contribute #8375
-
Thank you for writing this @ishaan-jaff 🎉🎉
-
cc: @jamesbraza @paul-gauthier @xingyaoww @okhat @joaomdmoura let us know if we missed anything here
-
Nice! Does the Langfuse embedding point also include token monitoring? For example, if you serve an embedding model with vLLM, the response already contains the token counts, but I think LiteLLM is currently not forwarding them to Langfuse.
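For context, here is a minimal sketch of what the client already gets back today, assuming a vLLM embedding server exposed through LiteLLM's OpenAI-compatible provider (the model name, api_base, and key below are placeholders):

```python
import litellm

# Embedding request against a vLLM server registered as an OpenAI-compatible
# endpoint. The model name, api_base, and api_key are placeholder values.
response = litellm.embedding(
    model="openai/my-embedding-model",
    api_base="http://localhost:8000/v1",
    api_key="dummy-key",
    input=["does this token count reach Langfuse?"],
)

# vLLM already reports usage on the embedding response; the ask is that these
# counts also get forwarded to the Langfuse trace for embedding calls.
print(response.usage.prompt_tokens, response.usage.total_tokens)
```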
-
Note: next-gen wildcard support should include intelligent model type discovery, because not every region has every LLM model type available. When given credentials for a specific region, LiteLLM should be able to use provider APIs to enumerate which model types are available in that region. Keep in mind that providers (e.g., GCP, AWS) roll out new model types per region every month or so, so this isn't a static one-time operation. My recommendation would be to put this on a monthly cron schedule so the list is refreshed regularly. Bonus points: if a user runs a "model health" check in the Admin UI, that refreshes the list on-demand as well.
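For illustration only, a rough sketch of the per-region discovery step, assuming AWS credentials and boto3's Bedrock control-plane API; the region list and the cron/refresh wiring are placeholders, and Vertex AI would need its own equivalent call:

```python
import boto3

def discover_bedrock_models(regions):
    """Enumerate which Bedrock foundation models are available in each region.

    Returns a mapping of region name -> set of available model IDs. Intended
    to run on a schedule (e.g., a monthly cron) and on demand from a
    "model health" check, since providers add models per region over time.
    """
    available = {}
    for region in regions:
        client = boto3.client("bedrock", region_name=region)
        summaries = client.list_foundation_models()["modelSummaries"]
        available[region] = {m["modelId"] for m in summaries}
    return available

if __name__ == "__main__":
    # Example regions; in practice these would come from the configured credentials.
    print(discover_bedrock_models(["us-east-1", "eu-central-1"]))
```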
-
✨ LiteLLM Feb 2025 Roadmap + How to Contribute
We’re excited for 2025! Below are key improvements with direct links to issues.
Our main focus areas are listed in the sections below.
What would you like to see added, fixed, or improved in Feb 2025?
Detailed Issues & Feature Requests
How to Contribute (We need help) 🤗🤗
🌟 Goal: Complete 50 items by Feb 28th
🔧 LLM Translation
Bedrock
Bugs:
Features:
/bedrock/invoke/ route support for all Anthropic and Nova models
OpenAI
Bugs:
Features:
Anthropic
Bugs:
Vertex AI
Bugs:
New models / providers
General Improvements
Spend Tracking / Token Counting
Bugs:
Features:
📊 Logging (focus on Langfuse)
Bugs:
Features:
Proxy Admin UI Improvements
delete button is hidden (fix sizing)
General Proxy Improvements
accept-data-loss for prisma migrations
Caching
🔐 Security
Strengthen system security.
Bugs:
Features:
⚙️ Service Availability
Bugs:
Adding a New Issue
When adding a new issue, tag it with one of these labels: bedrock, openai, structured outputs, logging, langfuse, security, service availability.
Happy contributing!