Blue Shell is an AI chat shell for local AI services. Version 0.0.1 supports Ollama.
pip install blueshell
At its simplest, run:
python -m blueshell.shell -m "codellama"
If Ollama isn't listening on its default port (11434), for example when it uses 11435 instead, you can pass a --url option like this:
python -m blueshell.shell -m "codellama" --url http://127.0.0.1:11435
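As a sketch of how such a --url value can be interpreted, the host and port can be split out with the standard library. The helper name and the fallback to Ollama's default port 11434 are illustrative assumptions, not Blue Shell's actual code:

```python
from urllib.parse import urlparse

# Hypothetical helper: split a --url value into host and port,
# falling back to Ollama's default port 11434 when none is given.
def host_port(url: str) -> tuple[str, int]:
    parts = urlparse(url)
    return parts.hostname or "127.0.0.1", parts.port or 11434

print(host_port("http://127.0.0.1:11435"))  # ('127.0.0.1', 11435)
print(host_port("http://localhost"))        # ('localhost', 11434)
```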
To see all options, run help:
$ python -m blueshell.shell --help
usage: Blue Shell [-h] [--url URL] [-p PROMPT] [-m MODEL]
                  [-f {markdown,plain,json}] [-s SYSTEM]

A AI assistant for local ai service

options:
  -h, --help            show this help message and exit
  --url URL
  -p PROMPT, --prompt PROMPT
  -m MODEL, --model MODEL
  -f {markdown,plain,json}, --format {markdown,plain,json}
  -s SYSTEM, --system SYSTEM

Powered By Python
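The options above map naturally onto Python's argparse module. A minimal reconstruction of such a parser (a hypothetical sketch, not Blue Shell's source; the default URL and format values are assumptions) could look like:

```python
import argparse

# Hypothetical reconstruction of a parser matching the --help output
# above. Defaults for --url and --format are assumed, not confirmed.
def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        prog="Blue Shell",
        description="A AI assistant for local ai service",
        epilog="Powered By Python",
    )
    parser.add_argument("--url", default="http://127.0.0.1:11434")
    parser.add_argument("-p", "--prompt")
    parser.add_argument("-m", "--model")
    parser.add_argument("-f", "--format",
                        choices=["markdown", "plain", "json"],
                        default="markdown")
    parser.add_argument("-s", "--system")
    return parser

args = build_parser().parse_args(["-m", "codellama"])
print(args.model, args.format)  # codellama markdown
```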
You can list all models available in Ollama:
$ python -m blueshell.list
The list command also accepts a --url option:
$ python -m blueshell.list --url http://127.0.0.1:11435
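Ollama exposes a REST endpoint, /api/tags, that returns the installed models, which is presumably what blueshell.list queries. A sketch of extracting model names from that response (the payload shape {"models": [{"name": ...}]} comes from Ollama's API; the helper names are illustrative, not Blue Shell's code):

```python
import json
from urllib.request import urlopen

# Sketch: pull model names out of an Ollama /api/tags payload.
# The {"models": [{"name": ...}]} shape follows Ollama's REST API.
def model_names(payload: dict) -> list[str]:
    return [m["name"] for m in payload.get("models", [])]

# Hypothetical fetcher; requires a running Ollama instance.
def list_models(url: str = "http://127.0.0.1:11434") -> list[str]:
    with urlopen(url + "/api/tags") as resp:
        return model_names(json.load(resp))

sample = {"models": [{"name": "codellama:latest"}, {"name": "llama2:latest"}]}
print(model_names(sample))  # ['codellama:latest', 'llama2:latest']
```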
Changelog:
- support Ollama
- fix documentation typos
- print feedback as markdown
- fix missing dependencies
- add list command
- add format argument
- C-c interrupts the REPL and continues
- improved user experience
- support pretty-printed JSON output
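The pretty JSON output mentioned above can be produced with the standard json module. A minimal sketch of what such a formatter might do (illustrative only, not Blue Shell's actual implementation):

```python
import json

# Sketch: render a reply object as indented JSON, as a pretty
# "--format json" printer might (hypothetical helper).
def pretty(obj) -> str:
    return json.dumps(obj, indent=2, ensure_ascii=False, sort_keys=True)

print(pretty({"model": "codellama", "done": True}))
```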