
LLMChat

The most intuitive all-in-one AI chat interface.

Key Features

  • 🧠 Multiple LLM Providers: Supports various language models, including Ollama.
  • 🔌 Plugins Library: Enhance functionality with an expandable plugin system, including function calling capabilities.
  • 🌐 Web Search Plugin: Allows AI to fetch and utilize real-time web data.
  • 🤖 Custom Assistants: Create and tailor AI assistants for specific tasks or domains.
  • 🗣️ Text-to-Speech: Converts AI-generated text responses to speech using Whisper.
  • 🎙️ Speech-to-Text: (Coming soon) Enables voice input for more natural interaction.
  • 💾 Local Storage: Securely store data locally using in-browser IndexedDB for faster access and privacy.
  • 📤📥 Data Portability: Easily import or export chat data for backup and migration.
  • 📚 Knowledge Spaces: (Coming soon) Build custom knowledge bases for specialized topics.
  • 📝 Prompt Library: Use pre-defined prompts to guide AI conversations efficiently.
  • 👤 Personalization: Memory plugin ensures more contextual and personalized responses.
  • 📱 Progressive Web App (PWA): Installable on various devices for a native-like app experience.
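The data-portability feature above boils down to serializing chat sessions to JSON and validating them on the way back in. A minimal sketch of that round trip, assuming a simplified session shape (the type and function names here are illustrative, not the project's actual API):

```typescript
// Hypothetical chat-session shape; the real app stores richer
// records in IndexedDB.
interface ChatSession {
  id: string;
  title: string;
  messages: { role: "user" | "assistant"; content: string }[];
}

// Serialize sessions to a versioned JSON string suitable for
// saving as a backup file.
function exportSessions(sessions: ChatSession[]): string {
  return JSON.stringify({ version: 1, sessions }, null, 2);
}

// Parse a previously exported backup, checking the top-level
// shape before handing the sessions back to the app.
function importSessions(json: string): ChatSession[] {
  const parsed = JSON.parse(json);
  if (!Array.isArray(parsed?.sessions)) {
    throw new Error("Invalid backup file: missing sessions array");
  }
  return parsed.sessions as ChatSession[];
}
```

Because the export is plain JSON, the same file works for both backup and migration between browsers or devices.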

Tech Stack

  • 🌍 Next.js
  • 🔤 TypeScript
  • 🗂️ Pglite
  • 🧩 LangChain
  • 📦 Zustand
  • 🔄 React Query
  • 🗄️ Supabase
  • 🎨 Tailwind CSS
  • 🎞️ Framer Motion
  • 🖌️ Shadcn
  • 📝 Tiptap

Roadmap

  • 🎙️ Speech-to-Text: Coming soon.
  • 📚 Knowledge Spaces: Coming soon.

Quick Start

To get the project running locally:

Prerequisites

  • Ensure you have Node.js installed, along with yarn or bun.

Installation

  1. Clone the repository:

     git clone https://github.com/your-repo/llmchat.git
     cd llmchat

  2. Install dependencies:

     yarn install
     # or
     bun install

  3. Start the development server:

     yarn dev
     # or
     bun dev

  4. Open your browser and navigate to http://localhost:3000.


Deployment

Instructions for deploying the project will be added soon.