api_key_name isn't used in extra-openai-models.yaml if OPENAI_API_KEY env var is set #158

Closed
itsderek23 opened this issue Aug 16, 2023 · 14 comments
Labels: bug (Something isn't working)

itsderek23 commented Aug 16, 2023

If I add an openai-compatible model to extra-openai-models.yaml and specify an api_key_name, the CLI does not appear to use the key value. Instead, it appears to use the openai key.
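
A minimal extra-openai-models.yaml entry of the kind described here might look like this (the api_base URL is invented for illustration; the opstower names match the key file below):

- model_name: opstower
  model_id: opstower
  api_base: "https://api.opstower.example/v1"
  api_key_name: opstower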

Example key file:

{
  "// Note": "This file stores secret API credentials. Do not share!",
  "openai": "sk-redacted",
  "opstower": "redacted"
}

Example CLI command (inspecting server logs shows that the openai key value was passed):

llm -m opstower 'how many ec2 instances'

This works:

llm -m opstower 'how many ec2 instances' --key opstower

Originally mentioned by @klauern in #139 (comment)

This is on version 0.7.

itsderek23 (Author) commented:

I believe this happens if the env var OPENAI_API_KEY is set. If I unset this, it works.

Looking at get_key, the env_var is checked prior to default_key.

itsderek23 changed the title from "api_key_name doesn't appear to be used in extra-openai-models.yaml" to "api_key_name isn't used in extra-openai-models.yaml if OPENAI_API_KEY env var is set" Aug 16, 2023
simonw added the bug label Aug 17, 2023

simonw commented Aug 17, 2023

Relevant code:

llm/llm/__init__.py

Lines 97 to 105 in a4f55e9

def get_key(key_arg, default_key, env_var=None):
    keys = load_keys()
    if key_arg in keys:
        return keys[key_arg]
    if key_arg:
        return key_arg
    if env_var and os.environ.get(env_var):
        return os.environ[env_var]
    return keys.get(default_key)
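
To make the precedence concrete, here is a runnable sketch of that logic with the stored keys inlined (illustrative only, using the alias names from this issue; load_keys() normally reads keys.json):

import os

def get_key_sketch(key_arg, default_key, env_var=None):
    # Stands in for load_keys() reading keys.json
    keys = {"openai": "sk-redacted", "opstower": "redacted"}
    if key_arg in keys:
        return keys[key_arg]
    if key_arg:
        return key_arg
    if env_var and os.environ.get(env_var):
        return os.environ[env_var]
    return keys.get(default_key)

os.environ["OPENAI_API_KEY"] = "sk-from-env"
# The env var is checked before the stored default key, so the stored
# "opstower" key is never reached - exactly the bug reported here:
print(get_key_sketch(None, "opstower", "OPENAI_API_KEY"))  # -> "sk-from-env"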


simonw commented Aug 17, 2023

I don't like those variable names, and that function needs proper documentation. It's confusing to look at.


simonw commented Aug 17, 2023

Oh that's weird... the main place that keys come from doesn't even call that utility method!

llm/llm/models.py

Lines 209 to 234 in a4f55e9

class Model(ABC):
    model_id: str
    key: Optional[str] = None
    needs_key: Optional[str] = None
    key_env_var: Optional[str] = None
    can_stream: bool = False

    class Options(_Options):
        pass

    def get_key(self):
        if self.needs_key is None:
            return None
        if self.key is not None:
            return self.key
        if self.key_env_var is not None:
            key = os.environ.get(self.key_env_var)
            if key:
                return key
        message = "No key found - add one using 'llm keys set {}'".format(
            self.needs_key
        )
        if self.key_env_var:
            message += " or set the {} environment variable".format(self.key_env_var)
        raise NeedsKeyException(message)


simonw commented Aug 17, 2023

The main place the get_key() function is called is here:

llm/llm/cli.py

Lines 211 to 213 in a4f55e9

# Provide the API key, if one is needed and has been provided
if model.needs_key:
    model.key = get_key(key, model.needs_key, model.key_env_var)

Where key is the --key X option from the CLI call.

simonw added a commit that referenced this issue Aug 17, 2023
The actual logic is unchanged, but it is a lot easier to understand what it does now.

Refs #158
itsderek23 (Author) commented:

I'll try to do a PR for this next week. Thanks for the pointers.


simonw commented Aug 18, 2023

I wrote this:

diff --git a/llm/models.py b/llm/models.py
index 10006bf..4cfd1e9 100644
--- a/llm/models.py
+++ b/llm/models.py
@@ -217,15 +217,24 @@ class Model(ABC):
         pass
 
     def get_key(self):
+        from llm import get_key
+
         if self.needs_key is None:
+            # This model doesn't use an API key
             return None
+
         if self.key is not None:
+            # Someone already set model.key='...'
             return self.key
-        if self.key_env_var is not None:
-            key = os.environ.get(self.key_env_var)
-            if key:
-                return key
 
+        # Attempt to load a key using llm.get_key()
+        key = get_key(
+            explicit_key=None, key_alias=self.needs_key, env_var=self.key_env_var
+        )
+        if key:
+            return key
+
+        # Show a useful error message
         message = "No key found - add one using 'llm keys set {}'".format(
             self.needs_key
         )

But I need to come up with a robust test plan - both manual and automated - for ensuring it does what it needs to do.
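
One possible automated check, sketched with pytest (the alias is illustrative, and load_keys is monkeypatched rather than reading a real keys.json; the get_key signature follows the refactored code above):

import llm

def test_get_key_uses_stored_alias(monkeypatch):
    # Pretend keys.json holds a key for the "opstower" alias
    monkeypatch.setattr(llm, "load_keys", lambda: {"opstower": "op-stored"})
    monkeypatch.delenv("OPENAI_API_KEY", raising=False)
    assert llm.get_key(None, "opstower", "OPENAI_API_KEY") == "op-stored"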

simonw added this to the 0.8 milestone Aug 18, 2023

simonw commented Aug 20, 2023

I can test this more easily using the OpenRouter support I added in:


simonw commented Aug 21, 2023

Managed to replicate this bug:

llm -m claude 'say hi and your name'                 
 Hello, my name is Claude. It's nice to meet you!

But with the environment variable set:

OPENAI_API_KEY=x llm -m claude 'say hi and your name'
Error: Invalid credentials


simonw commented Aug 21, 2023

This was a deliberate design decision:

llm/llm/__init__.py

Lines 114 to 118 in 341dbce

# Environment variables over-ride the default key
if env_var and os.environ.get(env_var):
    return os.environ[env_var]
# Return the key stored for the default alias
return stored_keys.get(key_alias)

I'm trying to remember why.


simonw commented Aug 21, 2023

I'm going to change that decision.

If you want to use the key from your environment variable instead, you can do this:

llm "my prompt" --key $OPENAI_API_KEY

simonw closed this as completed in dff36f0 Aug 21, 2023

simonw added a commit that referenced this issue Aug 21, 2023
itsderek23 (Author) commented:

Thank you @simonw !


cmungall commented Aug 1, 2024

This is more of a note for myself if I end up searching the docs again.

When using extra-openai-models.yaml, it seems natural to assume you can do this:

- model_name: lbl/llama-3
  model_id: lbl/llama-3
  api_base: "https://api.cborg.lbl.gov"
  api_key_name: CBORG_API_KEY

and it will pick up CBORG_API_KEY from the env. In fact this doesn't happen - it needs to be done using llm keys, or on the command line as specified above.

I prefer the former, so I usually give a more conventional key name:

- model_name: lbl/llama-3
  model_id: lbl/llama-3
  api_base: "https://api.cborg.lbl.gov"
  api_key_name: cborg

and manage this using

llm keys set cborg
