Replies: 2 comments 7 replies
-
Will add to backlog, and if this ticket gets a lot of demand we can re-prioritize.
-
Is it not sufficient to simply change the API base to point to OpenRouter for the LangSmith integration to work?
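For reference, a minimal sketch of what that suggestion might look like in Python, assuming the OpenAI-compatible `ChatOpenAI` client from `langchain-openai`, the standard LangSmith tracing environment variables, and an OpenRouter key stored in a hypothetical `OPENROUTER_API_KEY` variable:

```python
import os
from langchain_openai import ChatOpenAI

# Enable LangSmith tracing via environment variables.
# (Older setups use LANGCHAIN_TRACING_V2 / LANGCHAIN_API_KEY instead.)
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_API_KEY"] = "<your-langsmith-api-key>"

# Point the OpenAI-compatible client at OpenRouter instead of api.openai.com.
# Model identifiers follow OpenRouter's "<provider>/<model>" convention.
llm = ChatOpenAI(
    model="openai/gpt-4o-mini",  # example OpenRouter model id
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],  # hypothetical env var name
)

# Calls made through this client are traced in LangSmith like regular OpenAI calls.
print(llm.invoke("Say hello from OpenRouter").content)
```

This covers tracing of application calls; whether OpenRouter shows up as a selectable provider inside the Playground UI itself appears to be the separate concern raised in the feature request below.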
-
Feature request
OpenRouter is a very popular provider for LLM inference, and it should be available for use in the Playground with all of its models, alongside OpenAI and the other providers.
Motivation
It's a pity that it is not available in the LangSmith Playground. It means I can't really run experiments on traces, since all of our production inference is done on OpenRouter LLMs, and some of these models are not even available from other providers.