Gollama is a tool for managing Ollama models.
It provides a TUI (Text User Interface) for listing, inspecting, deleting, copying, and pushing Ollama models as well as optionally linking them to LM Studio.
The application allows users to interactively select models, sort them by various criteria, and perform actions on them using hotkeys.
The project started off as a rewrite of my llamalink project, but I decided to expand it to include more features and make it more user-friendly.
It's in the early stages of development, so there are plenty of bugs and missing features, but I'm already finding it useful for managing my models, especially for cleaning up old models.
- Interactive TUI with sorting and filtering capabilities.
- List available models and display basic metadata such as size, quantization level, model family, and modified date.
- Sort models by name, size, modification date, quantization level, and family.
- Select and delete models.
- Inspect model for additional details.
- Link models to LM Studio.
- Copy models.
- Push models to a registry.
- Show running models.
- Plenty more coming soon if I continue to find the tool useful.
From Go:
go install github.com/sammcj/gollama@HEAD
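This is a general Go note rather than anything gollama-specific: go install places the binary in your Go bin directory, so if the gollama command isn't found afterwards, make sure that directory is on your PATH, e.g.:
export PATH="$PATH:$(go env GOPATH)/bin"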
From GitHub:
Download the most recent release from the releases page and extract the binary to a directory in your PATH.
e.g. unzip gollama*.zip -d gollama && mv gollama/gollama /usr/local/bin
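You can then confirm the binary is on your PATH with the -v flag (described under the command-line options below), which prints the version and exits:
gollama -v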
To run the gollama application, use the following command:
gollama
Tip: I like to alias gollama to g for quick access:
echo "alias g=gollama" >> ~/.zshrc
- `Space`: Select
- `Enter`: Run model (Ollama run)
- `i`: Inspect model
- `t`: Top (show running models) (Work in progress)
- `D`: Delete model
- `c`: Copy model
- `r`: Rename model (Work in progress)
- `u`: Update model (edit Modelfile) (Work in progress)
- `U`: Unload all models
- `P`: Push model
- `n`: Sort by name
- `s`: Sort by size
- `m`: Sort by modified
- `k`: Sort by quantization
- `f`: Sort by family
- `l`: Link model to LM Studio
- `L`: Link all models to LM Studio
- `q`: Quit
- `-l`: List all available Ollama models and exit
- `-ollama-dir`: Custom Ollama models directory
- `-lm-dir`: Custom LM Studio models directory
- `-no-cleanup`: Don't cleanup broken symlinks
- `-cleanup`: Remove all symlinked models and empty directories and exit
- `-u`: Unload all running models
- `-v`: Print the version and exit
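For example, to point gollama at non-default model locations, or to clean up the symlinks it has created (the paths below are purely illustrative; substitute your own):
gollama -ollama-dir ~/.ollama/models -lm-dir ~/.cache/lm-studio/models
gollama -cleanup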
Gollama can also be called with `-l` to list models without the TUI.
gollama -l
List (`gollama -l`):
Inspect (`i`)
Top (`t`)
Gollama uses a JSON configuration file located at `~/.config/gollama/config.json`. The configuration file includes options for sorting, columns, API keys, log levels, etc.
Example configuration:
{
"default_sort": "modified",
"columns": [
"Name",
"Size",
"Quant",
"Family",
"Modified",
"ID"
],
"ollama_api_key": "",
"ollama_api_url": "http://localhost:11434",
"lm_studio_file_paths": "",
"log_level": "info",
"log_file_path": "/Users/username/.config/gollama/gollama.log",
"sort_order": "Size",
"strip_string": "my-private-registry.internal/",
"editor": "",
"docker_container": ""
}
- `strip_string` can be used to remove a prefix from model names as they are displayed in the TUI. This can be useful if you have a common prefix such as a private registry that you want to remove for display purposes.
- `docker_container` - experimental - if set, gollama will attempt to perform any run operations inside the specified container.
- `editor` - experimental - if set, gollama will use this editor to open the Modelfile for editing.
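As an illustrative sketch only (the container name and editor below are placeholder values, not defaults), these options might be set like so in config.json:
{
  "strip_string": "my-private-registry.internal/",
  "docker_container": "ollama",
  "editor": "vim"
}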
- Clone the repository:
  git clone https://github.com/sammcj/gollama.git
  cd gollama
- Build:
  go get
  make build
- Run:
  ./gollama
Logs can be found in gollama.log, which is stored at $HOME/.config/gollama/gollama.log by default.
The log level can be set in the configuration file.
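To follow the log while reproducing an issue, you can simply tail it (adjust the path if you've changed log_file_path in the config):
tail -f ~/.config/gollama/gollama.log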
Contributions are welcome! Please fork the repository and create a pull request with your changes.
Copyright © 2024 Sam McLeod
This project is licensed under the MIT License. See the LICENSE file for details.
Thank you to folks such as Matt Williams for giving this a shot and providing feedback.