Groq Chat 🚀💬

Groq Chat is a lightning-fast, browser-based chat interface for language models powered by Groq's LPU (Language Processing Unit). It delivers ChatGPT-like conversations using Meta's Llama 3.1 series, with enhanced privacy and productivity features.

🌟 Features

  • 🧠 Powered by Groq's LPU for ultra-fast language model inference
  • 🤖 Access to Meta's Llama 3.1 series models (405B, 70B, and 8B parameters)
  • 🔒 Privacy-focused: Runs entirely in your browser, no server-side data storage
  • 🌐 RAG support for web page URLs: Attach and crawl web pages for context-aware conversations
  • 🎙️ Speech-to-text functionality for voice interactions
  • 📝 Edit messages and branch conversations
  • 💾 Save and manage conversation history locally
  • 🗑️ Delete conversations as needed
  • 🔗 No login required - just bring your Groq API key

🚀 Getting Started

Online Version

Visit https://groqchat-three.vercel.app/ to use Groq Chat online.

Local Setup

To run Groq Chat locally:

  1. Clone the repository:

    git clone https://github.com/yourusername/groq-chat.git
    cd groq-chat
    
  2. Install dependencies:

    npm install
    
  3. Run the development server:

    npm run dev
    
  4. Open http://localhost:3000 in your browser.

🔧 Usage

  1. Enter your Groq API key when prompted.
  2. Start chatting with the language model of your choice.
  3. Use the URL attachment feature to add context from web pages.
  4. Utilize speech-to-text for voice interactions.
  5. Edit, branch, save, or delete conversations as needed.
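Under the hood, the app sends your messages to Groq's OpenAI-compatible chat completions endpoint straight from the browser. A minimal sketch of what such a request looks like (the endpoint and the `llama-3.1-8b-instant` model name are based on Groq's public API, not on this app's actual wiring):

```javascript
// Build a Groq chat-completion request. Endpoint and model name are
// assumptions based on Groq's public, OpenAI-compatible API; substitute
// whichever model you pick in the UI.
function buildChatRequest(apiKey, model, messages) {
  return {
    url: "https://api.groq.com/openai/v1/chat/completions",
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // The API key is sent only to Groq; nothing is stored server-side.
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Example usage (uncomment the fetch lines to actually call the API):
const req = buildChatRequest("gsk_your_key_here", "llama-3.1-8b-instant", [
  { role: "user", content: "Hello!" },
]);
// const res = await fetch(req.url, req.options);
// const data = await res.json();
// console.log(data.choices[0].message.content);
```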

🛠️ Tech Stack

  • Next.js
  • Vercel for deployment
  • IndexedDB for local storage
  • Groq API for language model inference
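Because conversation history lives in IndexedDB, it never leaves your device. A minimal sketch of how saving a conversation locally might look (the `groq-chat` database and `conversations` store names are hypothetical, not the app's actual schema):

```javascript
// Serialize a conversation into a plain record suitable for IndexedDB.
function toRecord(id, title, messages) {
  return { id, title, messages, updatedAt: Date.now() };
}

// Persist a record in IndexedDB. Browser-only; guarded so the sketch is a
// no-op elsewhere. "groq-chat" / "conversations" are illustrative names.
function saveConversation(record) {
  return new Promise((resolve, reject) => {
    if (typeof indexedDB === "undefined") return resolve(null);
    const open = indexedDB.open("groq-chat", 1);
    open.onupgradeneeded = () =>
      open.result.createObjectStore("conversations", { keyPath: "id" });
    open.onsuccess = () => {
      const tx = open.result.transaction("conversations", "readwrite");
      tx.objectStore("conversations").put(record);
      tx.oncomplete = () => resolve(record.id);
      tx.onerror = () => reject(tx.error);
    };
    open.onerror = () => reject(open.error);
  });
}

// Example usage:
const rec = toRecord("c1", "First chat", [{ role: "user", content: "hi" }]);
// saveConversation(rec).then((id) => console.log("saved", id));
```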

🔜 Upcoming Features

  • Multi-modality support (when available from Groq)
  • Custom JavaScript macros for enhanced functionality
  • File attachment support (PDF, documents)
  • Auto-formatting options

🤝 Contributing

This is an open-source project. Contributions, issues, and feature requests are welcome!

📄 License

MIT License

🙏 Acknowledgements

  • Groq for their incredible LPU technology
  • Meta for the Llama 3.1 series models
  • Vercel for their excellent hosting and deployment services

Built with ❤️ by Unclecode (Follow me on X).