Python library support for adding aliases #154

Closed
simonw opened this issue Aug 12, 2023 · 6 comments

simonw commented Aug 12, 2023

I'm going to depend on LLM 0.7 or higher and have the download-model ... --alias llama2 option work by writing to the aliases.json file.

Originally posted by @simonw in simonw/llm-mlc#4 (comment)

This would make sense as a Python API method that plugins can use.

Maybe this:

from llm import set_alias

set_alias("llama2", "... model ID ...")
simonw added the enhancement label Aug 12, 2023

simonw commented Aug 13, 2023

After implementing this I should upgrade the various plugins that have their own alias mechanism to use this instead.

simonw added this to the 0.8 milestone Aug 18, 2023

simonw commented Aug 20, 2023

llm aliases --help
llm aliases
Usage: llm aliases [OPTIONS] COMMAND [ARGS]...

  Manage model aliases

Options:
  --help  Show this message and exit.

Commands:
  list    List current aliases
  path    Output the path to the aliases.json file
  remove  Remove an alias
  set     Set an alias for a model

I should add llm.remove_alias(name) too. It can raise KeyError if invalid.
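
A rough sketch of what that remove_alias() could look like, mirroring the aliases.json handling quoted later in this thread (the user_dir() import path here is an assumption, not the final implementation):

import json

from llm.cli import user_dir  # assumption: same user_dir() helper the CLI code below uses


def remove_alias(alias):
    "Remove an alias; raises KeyError if the alias does not exist."
    path = user_dir() / "aliases.json"
    if not path.exists():
        raise KeyError(alias)
    current = json.loads(path.read_text())
    # del raises KeyError if the alias is missing, matching the behaviour proposed above
    del current[alias]
    path.write_text(json.dumps(current, indent=4) + "\n")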

And I'll make llm aliases run llm aliases list by default, refs:

simonw added a commit that referenced this issue Aug 20, 2023
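
One way to make a click group fall back to one of its subcommands is click's invoke_without_command option; a rough sketch with hypothetical command names, not necessarily what the referenced commit does:

import click


@click.group(invoke_without_command=True)
@click.pass_context
def aliases(ctx):
    "Manage model aliases"
    if ctx.invoked_subcommand is None:
        # Bare `llm aliases` behaves like `llm aliases list`
        ctx.invoke(aliases_list)


@aliases.command(name="list")
def aliases_list():
    "List current aliases"
    ...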

simonw commented Aug 20, 2023

Here's the current implementation of the aliases_set() CLI command:

llm/cli.py (lines 715 to 724 in 0cd9333):

path = user_dir() / "aliases.json"
path.parent.mkdir(parents=True, exist_ok=True)
if not path.exists():
    path.write_text("{}\n")
try:
    current = json.loads(path.read_text())
except json.decoder.JSONDecodeError as ex:
    raise click.ClickException("aliases.json is invalid: {}".format(ex))
current[alias] = model_id
path.write_text(json.dumps(current, indent=4) + "\n")

It catches JSONDecodeError and turns that into ClickException - which makes sense for a CLI tool but isn't the right thing for a Python library method to do.

What SHOULD the Python library do if you call llm.set_alias() and it turns out the aliases.json file is invalid? That isn't something that should ever happen, but it still needs to be handled.

Two options:

  1. Raise an error, and hope that the caller is ready for that.
  2. Discard the current aliases.json file and write a new, valid one.

I'm very tempted to implement 2 - a broken aliases.json file is useless, and we want the set_alias() function to succeed.


simonw commented Aug 20, 2023

I'm going with option 2 - if the JSON is invalid, set_alias() rewrites it.
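
A sketch of set_alias() as a library function with that behaviour, adapted from the CLI code quoted above (the exact signature and import path in LLM itself may differ):

import json

from llm.cli import user_dir  # assumption: same user_dir() helper the CLI uses


def set_alias(alias, model_id):
    "Set an alias for a model ID, creating or repairing aliases.json as needed."
    path = user_dir() / "aliases.json"
    path.parent.mkdir(parents=True, exist_ok=True)
    if not path.exists():
        path.write_text("{}\n")
    try:
        current = json.loads(path.read_text())
    except json.decoder.JSONDecodeError:
        # A broken aliases.json is useless, so discard it and start from an empty mapping
        current = {}
    current[alias] = model_id
    path.write_text(json.dumps(current, indent=4) + "\n")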

simonw added a commit that referenced this issue Aug 20, 2023
simonw added a commit that referenced this issue Aug 20, 2023

simonw commented Aug 20, 2023

Now to refactor some code to use this.


simonw commented Aug 20, 2023

simonw added a commit that referenced this issue Aug 21, 2023