
Initial plugin #1

Closed
simonw opened this issue Aug 20, 2023 · 3 comments
Labels
enhancement (New feature or request)

Comments

simonw commented Aug 20, 2023

The goal is to allow people to run llm install llm-openrouter, set an API key and instantly get access to the full set of openrouter.ai models, as described in this JSON: https://openrouter.ai/api/v1/models

This already works in LLM main using extra-openai-models.yml (documented here: https://llm.datasette.io/en/latest/other-models.html#extra-http-headers), but I think it would be more usable as a separate plugin:

- model_id: claude
  model_name: anthropic/claude-2
  api_base: "https://openrouter.ai/api/v1"
  api_key_name: openrouter
  headers:
    HTTP-Referer: "https://llm.datasette.io/"
    X-Title: LLM
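
With that YAML in place, usage would look something like this: store a key with llm keys set openrouter (matching the api_key_name above), then run a prompt against the configured model_id, e.g. llm -m claude "Say hello".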
simonw added the enhancement label Aug 20, 2023
simonw commented Aug 20, 2023

Installing the plugin will cache a local copy of https://openrouter.ai/api/v1/models, using the same pattern as llm-gpt4all: https://github.com/simonw/llm-gpt4all/blob/0046e2bf5d0a9c369b804d7125a1ab50bd5878f1/llm_gpt4all.py#L26-L31
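
Something along these lines, as a rough sketch rather than final code (httpx as the HTTP client, llm.user_dir() for the cache location, the filename and the one-hour timeout are all assumptions at this point):

import json
import time

import httpx
import llm


def get_openrouter_models(cache_timeout=3600):
    # Assumed cache location: a JSON file inside LLM's user directory
    cache_path = llm.user_dir() / "openrouter_models.json"
    if cache_path.exists() and time.time() - cache_path.stat().st_mtime < cache_timeout:
        # Cached copy is recent enough, use it directly
        return json.loads(cache_path.read_text())["data"]
    try:
        response = httpx.get(
            "https://openrouter.ai/api/v1/models", follow_redirects=True
        )
        response.raise_for_status()
        cache_path.write_text(response.text)
        return response.json()["data"]
    except httpx.HTTPError:
        # Network failed: fall back to a stale cached copy if there is one
        if cache_path.exists():
            return json.loads(cache_path.read_text())["data"]
        raise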

The JSON format looks like this:

{
  "data": [
    {
      "id": "openai/gpt-3.5-turbo",
      "pricing": {
        "prompt": "0.0000015",
        "completion": "0.000002"
      },
      "context_length": 4095,
      "per_request_limits": {
        "prompt_tokens": "2871318",
        "completion_tokens": "2153488"
      }
    },
    {
      "id": "openai/gpt-3.5-turbo-0301",
      "pricing": {
        "prompt": "0.0000015",
        "completion": "0.000002"
      },
      "context_length": 4095,
      "per_request_limits": {
        "prompt_tokens": "2871318",
        "completion_tokens": "2153488"
      }
    },
    ...
  ]
}

I'm going to use model IDs like openrouter/openai/gpt-3.5-turbo-0301 - a bit wordy, but it makes it clear that they differ from the same models provided by other plugins.

Users can set aliases if they want shorter model identifiers.
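
For example, something like llm aliases set turbo openrouter/openai/gpt-3.5-turbo would give the longer ID a short name (the alias turbo here is just illustrative).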

simonw commented Aug 20, 2023

This is going to work by importing Chat from llm.default_plugins.openai_models.
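
A rough sketch of what that could look like, reusing the cached model list from the sketch above (the OpenRouterChat name is mine, and I'm assuming Chat accepts api_base and headers keyword arguments, mirroring the extra-openai-models.yml fields shown earlier):

import llm
from llm.default_plugins.openai_models import Chat


class OpenRouterChat(Chat):
    needs_key = "openrouter"  # set via `llm keys set openrouter`
    key_env_var = "OPENROUTER_KEY"  # assumed environment variable name

    def __str__(self):
        return "OpenRouter: {}".format(self.model_id)


@llm.hookimpl
def register_models(register):
    # Only register the models once a key has been configured
    key = llm.get_key("", "openrouter", "OPENROUTER_KEY")
    if not key:
        return
    for definition in get_openrouter_models():
        register(
            OpenRouterChat(
                model_id="openrouter/{}".format(definition["id"]),
                model_name=definition["id"],
                api_base="https://openrouter.ai/api/v1",
                headers={
                    "HTTP-Referer": "https://llm.datasette.io/",
                    "X-Title": "LLM",
                },
            )
        )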

At some point soon I'm going to move the OpenAI stuff out of LLM core and into a plugin, at which point this plugin will need to depend on it.

simonw commented Aug 20, 2023

... or maybe I should do that first.

simonw added a commit that referenced this issue Aug 20, 2023
simonw closed this as completed Aug 20, 2023