Request for Integration of Mistral Models into Langfuse #1535
Replies: 2 comments
-
Hi @Zherdev1996, thank you for suggesting this. We like the Mistral models and want to do more to support them natively in the future. For now, here are two ways to work around it:
  - Cost & token usage: you can already add Mistral pricing to Langfuse yourself; see how here: https://langfuse.com/docs/model-usage-and-cost
  - Mistral SDK: conveniently, Mistral's SDK follows the same underlying logic as OpenAI's. So you can use the Langfuse OpenAI SDK integration and swap the base URL to Mistral's to track Mistral executions very conveniently: https://langfuse.com/docs/integrations/openai/get-started
-
We recently added a notebook on how best to wrap the new Mistral SDK with the Langfuse decorator: https://langfuse.com/guides/cookbook/integration_mistral_sdk
-
Describe the feature or potential improvement
Our team frequently uses Mistral AI models for various natural language processing tasks. We would like to request support for Mistral models on the Langfuse platform. This would let us manage our workflow within Langfuse, leveraging its robust features while using the models we depend on. Pricing info: https://docs.mistral.ai/platform/pricing/
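For context, once per-token prices are registered for a custom model (per https://langfuse.com/docs/model-usage-and-cost), the cost of a generation is a simple linear calculation over token counts. A minimal sketch of that calculation follows; the prices below are placeholders, not real Mistral pricing, which lives at the link above.

```python
# Sketch: cost of one generation from token usage and per-token prices.
# Prices are PLACEHOLDERS (USD per 1M tokens), not actual Mistral pricing.
PRICING = {
    "mistral-small-latest": {"input": 1.0, "output": 3.0},
}


def generation_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one generation under the registered pricing."""
    p = PRICING[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000
```

With real prices filled in, this mirrors what Langfuse computes automatically once the model and its usage are reported on each trace.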
Additional information
No response