Initial plugin #1
Installing the plugin will start caching a local copy of https://openrouter.ai/api/v1/models. The JSON format looks like this:

```json
{
  "data": [
    {
      "id": "openai/gpt-3.5-turbo",
      "pricing": {
        "prompt": "0.0000015",
        "completion": "0.000002"
      },
      "context_length": 4095,
      "per_request_limits": {
        "prompt_tokens": "2871318",
        "completion_tokens": "2153488"
      }
    },
    {
      "id": "openai/gpt-3.5-turbo-0301",
      "pricing": {
        "prompt": "0.0000015",
        "completion": "0.000002"
      },
      "context_length": 4095,
      "per_request_limits": {
        "prompt_tokens": "2871318",
        "completion_tokens": "2153488"
      }
    }
  ]
}
```

I'm going to set model IDs of … Users can set aliases if they want shorter model indicators.
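As an illustration of deriving model IDs from that cached JSON, here is a minimal sketch. The `openrouter/` prefix and the `model_ids` helper are assumptions for the example, not the plugin's confirmed naming scheme:

```python
import json

# Sample trimmed from the cached https://openrouter.ai/api/v1/models document
sample = json.loads("""
{"data": [
  {"id": "openai/gpt-3.5-turbo", "context_length": 4095},
  {"id": "openai/gpt-3.5-turbo-0301", "context_length": 4095}
]}
""")

def model_ids(models_doc, prefix="openrouter/"):
    # Prefix each OpenRouter ID so it cannot clash with IDs
    # registered by other LLM plugins
    return [prefix + model["id"] for model in models_doc["data"]]

print(model_ids(sample))
# → ['openrouter/openai/gpt-3.5-turbo', 'openrouter/openai/gpt-3.5-turbo-0301']
```

A user-facing alias (e.g. a short name mapping to the full prefixed ID) could then be a simple dict lookup layered on top of these IDs.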
This is going to work by importing … At some point soon I'm going to move the OpenAI stuff out of LLM core and into a plugin, at which point this plugin will need to depend on it.
... or maybe I should do that first.
Goal is to allow people to run:

```
llm install llm-openrouter
```

and set an API key and instantly get access to the full set of openrouter.ai models, as described in this JSON: https://openrouter.ai/api/v1/models

This works in LLM `main` right now using `extra-openai-models.yml`, but I think it could be more usable as a separate plugin.

Documented here: https://llm.datasette.io/en/latest/other-models.html#extra-http-headers
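For comparison, an `extra-openai-models.yml` entry for one OpenRouter model might look roughly like this. The specific keys follow the linked LLM documentation, but the values shown (model IDs, key name, headers) are illustrative assumptions:

```yaml
# Illustrative sketch of one entry; every model would need its own
- model_id: openrouter-gpt-3.5-turbo
  model_name: openai/gpt-3.5-turbo
  api_base: "https://openrouter.ai/api/v1"
  api_key_name: openrouter
  headers:
    HTTP-Referer: "https://llm.datasette.io/"
```

The drawback is clear from the sketch: every OpenRouter model needs a hand-written entry, whereas a plugin can register all of them automatically from the cached models JSON.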