
Chital

A native macOS app for chatting with Ollama models

Features

  • Low memory usage and fast app launch times
  • Support for multiple chat threads
  • Switch between different models
  • Markdown support
  • Automatic chat thread title summarization

[Demo screen recording: Screen.Recording.2024-09-29.at.6.24.53.PM.mov]

Requirements

  • macOS 14 Sonoma and above
  • Ensure Ollama is installed
  • Ensure at least one LLM model is downloaded (see the sketch below for a quick check)
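
Ollama exposes a local HTTP API (http://localhost:11434 by default), and its GET /api/tags endpoint lists the models you have pulled. The snippet below is a minimal Swift sketch, not part of Chital's code, for checking that the server is reachable and that at least one model is available; the helper name is made up for illustration.

```swift
import Foundation

// Decodes just the model names from Ollama's GET /api/tags response.
struct TagsResponse: Decodable {
    struct Model: Decodable { let name: String }
    let models: [Model]
}

// Hypothetical helper: returns the names of locally installed models.
// Assumes the default base URL; pass your own if you changed it.
func installedModels(baseURL: URL = URL(string: "http://localhost:11434")!) async throws -> [String] {
    let url = baseURL.appendingPathComponent("api/tags")
    let (data, _) = try await URLSession.shared.data(from: url)
    return try JSONDecoder().decode(TagsResponse.self, from: data).models.map(\.name)
}

// Usage (e.g. inside a Task):
//   let models = (try? await installedModels()) ?? []
//   print(models.isEmpty ? "Install Ollama and pull a model first" : models.joined(separator: ", "))
```

If the call fails or the list comes back empty, install Ollama and run `ollama pull <model>` before launching Chital.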

Installation

  • Download Chital
  • Move Chital.app from the Downloads folder into the Applications folder.
  • Go to System Settings -> Privacy & Security and click Open Anyway
[Screenshot: System Settings -> Privacy & Security]

Configuration

The following settings can be changed from Chital > Settings (the sketch after this list shows how they map onto an Ollama request):

  • Default model
  • Ollama base URL
  • Context window length
  • Chat thread title summarization prompt
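
These settings correspond closely to the fields of an Ollama chat request. As a hedged sketch (assumed names and structure, not Chital's actual implementation), here is how the default model, base URL, and context window length could feed into one non-streaming call to Ollama's /api/chat endpoint:

```swift
import Foundation

struct ChatMessage: Codable { let role: String; let content: String }

// Request/response shapes for Ollama's /api/chat (only the fields used here).
struct ChatRequest: Encodable {
    struct Options: Encodable { let num_ctx: Int }  // context window length
    let model: String            // default model from Settings
    let messages: [ChatMessage]
    let stream: Bool
    let options: Options
}
struct ChatResponse: Decodable { let message: ChatMessage }

// Hypothetical helper: sends one chat turn and returns the assistant's reply.
func chat(baseURL: URL, model: String, contextWindow: Int,
          messages: [ChatMessage]) async throws -> String {
    var request = URLRequest(url: baseURL.appendingPathComponent("api/chat"))
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        ChatRequest(model: model, messages: messages, stream: false,
                    options: .init(num_ctx: contextWindow)))
    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(ChatResponse.self, from: data).message.content
}
```

Thread title summarization presumably follows the same pattern: the configurable summarization prompt plus the thread's opening messages are sent to the model, and the reply is used as the thread title.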

Keyboard Shortcuts

  • Command + N: New chat thread
  • Option + Enter: Multiline input

Contributions

I built this application mainly for my own personal use. Feel free to fork this codebase to add features. I might not have time to review PRs and bug tickets.

License

MIT