A modern web interface for chatting with your local LLMs through Ollama
- 🖥️ Clean, modern interface for interacting with Ollama models
- 💾 Local chat history using IndexedDB (see the storage sketch after this list)
- 📝 Full Markdown support in messages
- 🌙 Dark mode support
- 🚀 Fast and responsive
- 🔒 Privacy-focused: All processing happens locally
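
Chat history is kept in the browser's IndexedDB, so conversations never leave your machine. Below is a minimal sketch of how such persistence can work; the database/store names and the `ChatMessage` shape are illustrative assumptions, not ollama-gui's actual schema:

```ts
// Illustrative only: a tiny IndexedDB wrapper for saving chat messages.
// The names 'chat-history' / 'messages' and the ChatMessage shape are
// assumptions, not ollama-gui's actual schema.
interface ChatMessage {
  id?: number
  role: 'user' | 'assistant'
  content: string
  createdAt: number
}

function openHistoryDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open('chat-history', 1)
    // Create the object store on first open (or version bump).
    req.onupgradeneeded = () => {
      req.result.createObjectStore('messages', { keyPath: 'id', autoIncrement: true })
    }
    req.onsuccess = () => resolve(req.result)
    req.onerror = () => reject(req.error)
  })
}

async function saveMessage(msg: ChatMessage): Promise<void> {
  const db = await openHistoryDb()
  return new Promise((resolve, reject) => {
    const tx = db.transaction('messages', 'readwrite')
    tx.objectStore('messages').add(msg)
    tx.oncomplete = () => resolve()
    tx.onerror = () => reject(tx.error)
  })
}
```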
```shell
# Start Ollama server with your preferred model
ollama pull mistral  # or any other model
ollama serve
```
```shell
# Clone and run the GUI
git clone https://github.com/HelgeSverre/ollama-gui.git
cd ollama-gui
yarn install
yarn dev
```
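
Once both are running, the GUI talks to Ollama's REST API on its default port, 11434. As a reference for what happens under the hood, here is a hedged sketch of a streaming request against Ollama's documented `/api/chat` endpoint; the GUI's actual client code may differ:

```ts
// Sketch of a streaming request to Ollama's /api/chat endpoint.
// The endpoint and payload shapes follow Ollama's API docs; how
// ollama-gui itself wraps this call is not shown here.
async function streamChat(prompt: string, model = 'mistral'): Promise<string> {
  const res = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model,
      messages: [{ role: 'user', content: prompt }],
      stream: true, // Ollama streams newline-delimited JSON chunks
    }),
  })

  const reader = res.body!.getReader()
  const decoder = new TextDecoder()
  let reply = ''
  let buffer = ''
  for (;;) {
    const { done, value } = await reader.read()
    if (done) break
    buffer += decoder.decode(value, { stream: true })
    // Each complete line is one JSON chunk carrying a partial message.
    const lines = buffer.split('\n')
    buffer = lines.pop() ?? ''
    for (const line of lines) {
      if (!line.trim()) continue
      const chunk = JSON.parse(line)
      if (chunk.message?.content) reply += chunk.message.content
    }
  }
  return reply
}
```

Calling `streamChat('Why is the sky blue?')` against a running server resolves with the full reply once the stream ends.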
To use the hosted version instead, start Ollama with the hosted GUI's origin in its CORS allowlist (`OLLAMA_ORIGINS`), so the browser app is permitted to call your local server:

```shell
OLLAMA_ORIGINS=https://ollama-gui.vercel.app ollama serve
```
```shell
# Build the image
docker build -t ollama-gui .

# Run the container
docker run -p 8080:80 ollama-gui

# Access at http://localhost:8080
```
- Chat history with IndexedDB
- Markdown message formatting
- Code cleanup and organization
- Model library browser and installer (see the API sketch after this list)
- Mobile-responsive design
- File uploads with OCR support
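
For the planned model browser and installer, Ollama already exposes endpoints a browser client could build on. A hedged sketch using the documented `/api/tags` (list installed models) and `/api/pull` (download a model) endpoints follows; this is an assumption about how the feature might work, not the GUI's eventual implementation:

```ts
// List models already installed on the local Ollama server.
async function listModels(): Promise<string[]> {
  const res = await fetch('http://localhost:11434/api/tags')
  const data = await res.json()
  return data.models.map((m: { name: string }) => m.name)
}

// Ask Ollama to download a model. The parameter name follows current
// Ollama API docs (older servers used `name`); progress arrives as
// streamed JSON status lines, skipped here with stream: false.
async function pullModel(model: string): Promise<void> {
  await fetch('http://localhost:11434/api/pull', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model, stream: false }),
  })
}
```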
- Vue.js - Frontend framework
- Vite - Build tool
- Tailwind CSS - Styling
- VueUse - Vue composition utilities (see the dark-mode sketch below)
- @tabler/icons-vue - Icons
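
As one concrete example of the stack in action, the dark mode listed under features maps naturally onto VueUse. This is a hypothetical composable, not necessarily how ollama-gui wires it up:

```ts
// Hypothetical dark-mode toggle built on VueUse (the GUI's actual
// implementation may differ). useDark() tracks the OS preference,
// persists the user's choice, and toggles a `dark` class that
// Tailwind's dark: variants key off; useToggle() flips it.
import { useDark, useToggle } from '@vueuse/core'

export const isDark = useDark()
export const toggleDark = useToggle(isDark)
```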
- Design inspired by LangUI
- Hosted on Vercel
Released under the MIT License.