Ollama GUI

A modern web interface for chatting with your local LLMs through Ollama

Powered by Ollama · MIT License · Live demo: https://ollama-gui.vercel.app

✨ Features

  • 🖥️ Clean, modern interface for interacting with Ollama models
  • 💾 Local chat history using IndexedDB
  • 📝 Full Markdown support in messages
  • 🌙 Dark mode support
  • 🚀 Fast and responsive
  • 🔒 Privacy-focused: All processing happens locally

🚀 Quick Start

Prerequisites

  1. Install Ollama
  2. Install Node.js (v16+) and Yarn

Local Development

```shell
# Start the Ollama server with your preferred model
ollama pull mistral  # or any other model
ollama serve

# Clone and run the GUI
git clone https://github.com/HelgeSverre/ollama-gui.git
cd ollama-gui
yarn install
yarn dev
```
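Before starting the GUI, you can confirm the Ollama server is reachable. A quick check, assuming Ollama is listening on its default address (`http://localhost:11434`):

```shell
# List locally installed models via Ollama's /api/tags endpoint;
# a JSON response here means the GUI will be able to reach the server
curl http://localhost:11434/api/tags
```

If this returns a JSON object listing your models, the server is up and the GUI can connect to it.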

Using the Hosted Version

The hosted version runs entirely in your browser but still connects to your local Ollama server. Allow that origin by starting Ollama with:

```shell
OLLAMA_ORIGINS=https://ollama-gui.vercel.app ollama serve
```
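`OLLAMA_ORIGINS` accepts a comma-separated list, so you can allow the hosted app and a local development origin at the same time. A sketch, where the `localhost:5173` origin is only an example and should be adjusted to wherever your dev server actually runs:

```shell
# Allow both the hosted GUI and a local dev origin (second origin is an example)
OLLAMA_ORIGINS="https://ollama-gui.vercel.app,http://localhost:5173" ollama serve
```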

Docker Deployment

```shell
# Build the image
docker build -t ollama-gui .

# Run the container
docker run -p 8080:80 ollama-gui

# Access the GUI at http://localhost:8080
```
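The containerized GUI still needs to reach the Ollama server running on the host. One possible setup (a sketch, assuming Ollama listens on the host's default port 11434) exposes the host to the container under a stable hostname:

```shell
# Make the host's Ollama reachable from inside the container as host.docker.internal
# (--add-host=...:host-gateway is needed on Linux; Docker Desktop provides the alias by default)
docker run -p 8080:80 \
  --add-host=host.docker.internal:host-gateway \
  ollama-gui
```

With this in place, you would point the GUI at `http://host.docker.internal:11434` and include that origin in `OLLAMA_ORIGINS` when starting Ollama.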

🛣️ Roadmap

  • Chat history with IndexedDB
  • Markdown message formatting
  • Code cleanup and organization
  • Model library browser and installer
  • Mobile-responsive design
  • File uploads with OCR support

🛠️ Tech Stack

📄 License

Released under the MIT License.
