🐳 Aurora is a Chinese-language MoE model. Aurora is a follow-up work built on Mixtral-8x7B that unlocks the model's open-domain chat capability in Chinese.
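Mixtral-style MoE layers (which Aurora inherits) route each token to the two highest-scoring experts and mix their outputs. The sketch below illustrates that top-2 gating in plain Python; the function and variable names are illustrative, not taken from Aurora's code.

```python
import math

def top2_moe(router_logits, expert_outputs):
    """Combine expert outputs with top-2 gating, as in Mixtral-style MoE layers.

    router_logits: one router score per expert for the current token.
    expert_outputs: the value each expert would produce for that token.
    """
    # Pick the two highest-scoring experts.
    ranked = sorted(range(len(router_logits)),
                    key=lambda i: router_logits[i], reverse=True)
    top2 = ranked[:2]
    # Softmax over just the two selected logits gives the mixing weights.
    exps = [math.exp(router_logits[i]) for i in top2]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Weighted sum of the two chosen experts' outputs.
    return sum(w * expert_outputs[i] for w, i in zip(weights, top2))

# 8 experts, as in Mixtral-8x7B; here only experts 2 and 5 win the routing.
logits = [0.1, -1.0, 2.0, 0.0, -0.5, 2.0, 0.3, -2.0]
outputs = [float(i) for i in range(8)]
print(top2_moe(logits, outputs))  # → 3.5 (equal weights on experts 2 and 5)
```

Only the selected experts run, which is why an 8-expert model has the per-token compute cost of roughly two experts.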
A versatile CLI and Python wrapper for Perplexity's suite of large language models, including their flagship Chat and Online 'Sonar Llama-3' models along with 'Llama-3' and 'Mixtral'. Streamline the creation of chatbots and search the web with AI in real time with ease.
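Perplexity exposes an OpenAI-compatible chat-completions endpoint, so a wrapper like this ultimately builds a request of the following shape. This is a sketch of the payload only, not the wrapper's actual code; the model name and the `<YOUR_API_KEY>` placeholder are assumptions for illustration.

```python
import json

# OpenAI-compatible chat-completions endpoint documented by Perplexity.
API_URL = "https://api.perplexity.ai/chat/completions"

def build_chat_request(prompt, model="llama-3-sonar-small-32k-online", stream=False):
    """Return (url, headers, body) for a chat completion call.

    The default model name is an assumption; check Perplexity's model list.
    """
    headers = {
        "Authorization": "Bearer <YOUR_API_KEY>",  # placeholder, not a real key
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,  # True streams tokens as they are generated
    }
    return API_URL, headers, json.dumps(body)

url, headers, body = build_chat_request("What happened in AI news today?")
print(url)
```

An actual call would POST `body` with those headers (e.g. via `requests.post`); the 'Online' Sonar models are the ones that search the web before answering.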
XMPP bot designed for privacy-focused AI language model interactions
AI Voice-Powered TODO app
Welcome to the Mixtral 8x7B offloading demo repository! This project aims to demonstrate the seamless execution of Mixtral-8x7B models on Colab or consumer desktops.
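Offloading works because only a subset of each layer's experts can live on the GPU while the rest wait in CPU RAM. The back-of-the-envelope estimate below shows why that is feasible on Colab-class hardware; it uses Mixtral-8x7B's published shapes (32 layers, 8 experts per layer, 4096×14336 SwiGLU FFNs) but is my own arithmetic, not the repository's algorithm.

```python
def experts_that_fit(gpu_gb, bytes_per_param=0.5, reserve_gb=2.0):
    """Estimate how many Mixtral experts per layer fit on the GPU.

    Assumes 4-bit weights (0.5 bytes/param) and Mixtral-8x7B's shapes:
    32 layers, 8 experts/layer, each expert a SwiGLU FFN built from
    three 4096x14336 weight matrices. `reserve_gb` is headroom for
    activations and the KV cache (an assumed figure).
    """
    hidden, ffn, layers = 4096, 14336, 32
    expert_params = 3 * hidden * ffn              # one expert's FFN weights
    expert_gb = expert_params * bytes_per_param / 1e9
    shared_gb = 1.3e9 * bytes_per_param / 1e9     # attention + embeddings, ~1.3B params
    budget = gpu_gb - reserve_gb - shared_gb
    per_layer = int(budget / (expert_gb * layers))
    return max(0, min(8, per_layer))

# A 16 GB GPU (e.g. a Colab T4) holds only ~4 of the 8 experts per layer,
# so the remaining experts must be offloaded and swapped in on demand.
print(experts_that_fit(16))
```

The demo's speed then hinges on prefetching: because consecutive tokens often reuse the same experts, a small GPU-resident cache of "hot" experts avoids most CPU↔GPU transfers.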
A Python module for running the Mixtral-8x7B language model with customisable precision and attention mechanisms.
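In `transformers`, precision and the attention kernel are selected via `from_pretrained` keyword arguments, so a module like this plausibly reduces to building that kwargs dict. The sketch below is an assumed interface — the option names (`"fp16"`, `"4bit"`, `"sdpa"`, ...) are illustrative and may not match the module's real flags.

```python
def load_kwargs(precision="fp16", attention="sdpa"):
    """Map user-facing precision/attention choices to `from_pretrained` kwargs.

    Illustrative only: the mapping mirrors standard `transformers` arguments
    (torch_dtype, attn_implementation, load_in_4bit), not this module's API.
    """
    kwargs = {"device_map": "auto", "attn_implementation": attention}
    if precision == "fp16":
        kwargs["torch_dtype"] = "float16"
    elif precision == "bf16":
        kwargs["torch_dtype"] = "bfloat16"
    elif precision == "4bit":
        kwargs["load_in_4bit"] = True   # requires the bitsandbytes package
    else:
        raise ValueError(f"unknown precision: {precision}")
    return kwargs

# e.g. AutoModelForCausalLM.from_pretrained(
#          "mistralai/Mixtral-8x7B-v0.1", **load_kwargs("4bit"))
print(load_kwargs("4bit"))
```

`attn_implementation` accepts values such as `"eager"`, `"sdpa"`, or `"flash_attention_2"` in recent `transformers` releases; flash attention additionally requires the `flash-attn` package and a supported GPU.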
Crew of AI Agents that investigate a company to help you prepare for your next interview
This project aims to build a RAG model to chat with your PDFs
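The core of any chat-with-your-PDFs RAG pipeline is the retrieval step: split the extracted text into chunks, rank them against the question, and paste the winners into the prompt. The toy version below uses word overlap instead of embeddings so it runs standalone; every name in it is hypothetical, not from this project.

```python
def top_chunks(pages, question, k=2, chunk_words=50):
    """Naive retrieval for a RAG pipeline: split extracted PDF text into
    fixed-size word chunks and rank them by word overlap with the question.
    A real system would use an embedding model and a vector store instead.
    """
    words = " ".join(pages).split()
    chunks = [" ".join(words[i:i + chunk_words])
              for i in range(0, len(words), chunk_words)]
    q = set(question.lower().split())
    # Higher overlap with the question's words -> more relevant chunk.
    scored = sorted(chunks,
                    key=lambda c: len(q & set(c.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(pages, question):
    """Stuff the retrieved chunks into a grounded prompt for the LLM."""
    context = "\n---\n".join(top_chunks(pages, question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

pages = ["the moon orbits the earth once a month",
         "bananas are a yellow fruit rich in potassium"]
print(build_prompt(pages, "what orbits the earth?"))
```

The generation step then sends that prompt to the model (Mixtral-8x7B-Instruct here), which answers from the supplied context rather than from memory alone.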
Working on LLM research
Add a description, image, and links to the mixtral-8x7b-instruct topic page so that developers can more easily learn about it.
To associate your repository with the mixtral-8x7b-instruct topic, visit your repo's landing page and select "manage topics."