This repository contains the codebase for a mechanical engineering assistant powered by the LLaMA LLM and implemented with retrieval-augmented generation (RAG). The assistant is built with LlamaIndex and Streamlit.
Mechanical-Assistant/
├── home.py
├── requirements.txt
└── llamaindex/
    ├── embeddings.py
    ├── indexing.py
    ├── llm.py
    ├── loading_data.py
    ├── main.py
    └── querying.py
The main aim of this repository is to provide a mechanical assistant application that can help users with tasks such as question answering and summarization related to mechanical engineering. It leverages the LLaMA LLM and a RAG implementation for its core functionality.
- Clone this repository:

  git clone https://github.com/just-ctrlC-ctrlV/Mechanical-Assistant.git
- Install the required dependencies:

  pip install -r requirements.txt
- Set up environment variables:
  - Ensure you have the necessary environment variables set, including LLAMAAPI, HUGGING_FACE_TOKEN, PINECONE_API_KEY, DB_DIMENSION, DB_INDEX_NAME, DB_METRIC, DB_ENV, and DB_REGION.
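A quick startup check can catch a missing variable before the app fails mid-request. A minimal sketch using only the standard library; the variable names come from the list above, while the helper name and check logic are illustrative, not part of the repository:

```python
import os

# Variable names taken from the setup instructions above.
REQUIRED_VARS = [
    "LLAMAAPI", "HUGGING_FACE_TOKEN", "PINECONE_API_KEY",
    "DB_DIMENSION", "DB_INDEX_NAME", "DB_METRIC", "DB_ENV", "DB_REGION",
]

def missing_vars(environ=os.environ):
    """Return the required variable names absent from the given environment."""
    return [name for name in REQUIRED_VARS if name not in environ]

if __name__ == "__main__":
    absent = missing_vars()
    if absent:
        raise SystemExit(f"Missing environment variables: {', '.join(absent)}")
```

Running this before `streamlit run home.py` gives an immediate, readable error instead of a stack trace from deep inside the app.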
To run the Mechanical Assistant application, execute the home.py file:

  streamlit run home.py
The repository consists of the following key files:
- home.py: Contains the main Streamlit application for the Mechanical Assistant.
- llamaindex/
  - embeddings.py: Handles setting up the embedding model.
  - indexing.py: Manages the creation and loading of the index.
  - llm.py: Connects to the LLaMA language model.
  - loading_data.py: Loads documents and repository data.
  - main.py: Main functionality for the assistant application.
  - querying.py: Handles querying the index with user input.
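At a high level, querying the index means embedding the user's question and ranking stored chunks by vector similarity before the LLM composes an answer. A toy, self-contained sketch of that retrieval step using cosine similarity over hand-made vectors; the real code relies on LlamaIndex and Pinecone rather than this illustrative helper:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, docs, top_k=2):
    """Return the texts of the top_k (text, vector) pairs most similar to query_vec."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [text for text, _ in ranked[:top_k]]
```

In the actual pipeline, the retrieved chunks would be passed to the LLM as context so that answers stay grounded in the indexed documents.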
Contributions to this repository are welcome. If you have suggestions or improvements, feel free to open an issue or submit a pull request.
This repository is licensed under the MIT License.