
🦙 Ellama

Ellama is a friendly interface to chat with a local or remote Ollama instance.

[Screenshot: Ellama, a friendly Ollama interface, running LLaVA]

🦙 Features

  • Chat History: create and delete chats, and edit model settings per chat.
  • Multimodality: easily use the vision capabilities of any multimodal model, such as LLaVA.
  • Ollama: no need to install a new inference engine; connect to a regular Ollama instance instead (see the example after this list).
  • Resource Efficient: minimal RAM and CPU usage.
  • Free: no subscriptions or servers to buy; just fire up a local Ollama instance.
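If you don't yet have a local Ollama instance running, a minimal setup looks something like this (assuming Ollama is already installed; LLaVA is just one example of a model you could pull):

  $ # start a local Ollama instance (serves http://127.0.0.1:11434 by default)
  $ ollama serve

  $ # in another terminal: pull a model to chat with, e.g. the multimodal LLaVA
  $ ollama pull llava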

🦙 Quickstart

  1. Download the latest Ellama release from the Releases page.
    • or build & install from source:
      $ git clone https://github.com/zeozeozeo/ellama.git
      $ cd ellama
      $ cargo install --path .
  2. In the Settings ⚙️ tab, change the Ollama host if needed (the default is http://127.0.0.1:11434); the snippet after this list shows a quick way to check the connection.
  3. In the same tab, select the model that new chats will use by default. Ellama tries to pick the best available model on the first run.
  4. Close the Settings tab, create a new chat by pressing the "➕ New Chat" button, and start chatting!
  5. To add images, click the ➕ button next to the text field, drag them onto Ellama's window, or paste them from your clipboard.
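If Ellama can't connect, it helps to verify that the host from step 2 is actually serving the Ollama API; /api/tags is the endpoint that lists the models installed on the instance:

  $ # should return a JSON object listing installed models
  $ curl http://127.0.0.1:11434/api/tags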

🦙 Gallery

[Video: GASLIGHT.MP4]

[Screenshot: Ellama's greeting screen]

[Screenshot: LLaVA counting people, in Ellama]

[Screenshot: Ellama's settings panel]

[Screenshot: Ellama's chat edit panel]

🦙 Wishlist

These features are not yet present in Ellama, but they would be nice to have:

  • Support for OpenAI-compatible APIs: currently only Ollama is supported
  • A "Notes" section, where you can write and edit LLM-assisted notes
  • Publish on crates.io: currently still relies on some git dependencies

License

Unlicense OR MIT OR Apache-2.0