This repository provides a Streamlit web application that runs the DeepSeek-R1 model locally. Users enter text prompts and receive AI-generated responses through a simple, interactive interface.
- Local Execution: Runs the DeepSeek R1 model on your machine.
- Streamlit UI: A lightweight and interactive web-based interface.
- Real-time Inference: Get responses from the DeepSeek R1 model instantly.
- Easy Setup: Simple installation and execution steps.
Ensure you have the following installed:
- Python 3.8+
- pip
- DeepSeek R1 Model (local installation)
- Virtual environment (optional but recommended)
```bash
git clone https://github.com/yashksaini-coder/streamlit-deepseek-r1.git
cd streamlit-deepseek-r1
python -m venv venv
source venv/bin/activate  # On Windows use `venv\Scripts\activate`
pip install -r requirements.txt
```
Ensure that you have downloaded and installed the DeepSeek-R1 model locally, following the official installation guide from DeepSeek AI. Since the app depends on `langchain_ollama`, the model is presumably served through Ollama; in that case the usual step is to pull a model tag, for example `ollama pull deepseek-r1:1.5b` (the tag is an assumption; choose a size that fits your hardware).
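Once the model is available, a quick pre-flight check like the sketch below can confirm that the local Ollama server is reachable before launching the app. This is only a convenience snippet; it assumes Ollama's default port (11434).

```python
# Pre-flight sketch: assumes a local Ollama instance on its default port (11434).
import sys
import urllib.request

assert sys.version_info >= (3, 8), "Python 3.8+ is required"

try:
    with urllib.request.urlopen("http://localhost:11434", timeout=2) as resp:
        print(f"Ollama server reachable (HTTP {resp.status})")
except OSError as exc:
    print(f"Ollama server not reachable: {exc}")
```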
Once everything is set up, start the Streamlit app:
```bash
streamlit run main.py
```
This will launch the app in your default web browser.
- Open the app in your browser.
- Enter a text prompt in the input box.
- Click the Submit button to generate a response.
- View the AI-generated response on the screen.
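For orientation, these steps map onto a handful of Streamlit widgets plus a single model call. The snippet below is a minimal sketch of that flow, assuming an Ollama-served model tag such as `deepseek-r1:1.5b`; it is not necessarily the repository's actual `main.py`.

```python
# Minimal sketch of the prompt -> response flow (illustrative, not the real main.py).
import streamlit as st
from langchain_ollama import ChatOllama  # assumes the model is served via Ollama

st.title("DeepSeek-R1 (local)")

prompt = st.text_area("Enter a text prompt")        # input box
if st.button("Submit") and prompt.strip():          # Submit button
    llm = ChatOllama(model="deepseek-r1:1.5b")      # model tag is an assumption
    with st.spinner("Generating response..."):
        reply = llm.invoke(prompt)
    st.write(reply.content)                         # AI-generated response
```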
```
DeepSeek-R1-demo/
├── .python-version
├── .venv/                 # local virtual environment (contents are git-ignored)
│   ├── .gitignore
│   ├── CACHEDIR.TAG
│   ├── etc/
│   ├── include/
│   ├── Lib/
│   ├── pyvenv.cfg
│   ├── Scripts/
│   └── share/
├── LICENSE
├── main.py                # Streamlit application entry point
├── pyproject.toml
├── README.md
├── requirements.txt
└── uv.lock
```
Ensure that `requirements.txt` contains the following packages:

```
streamlit
langchain_core
langchain_community
langchain_ollama
```
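As a hedged illustration of how these packages typically fit together (not necessarily how this repository wires them up): `langchain_core` provides the prompt template and output parser, `langchain_ollama` binds to the locally served model, and `langchain_community` offers additional integrations if needed.

```python
# Sketch only: assumes an Ollama-served DeepSeek-R1 model; the model tag is an assumption.
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_ollama import ChatOllama

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("human", "{question}"),
])
llm = ChatOllama(model="deepseek-r1:1.5b")
chain = prompt | llm | StrOutputParser()  # LangChain's pipe syntax composes runnables

print(chain.invoke({"question": "Summarize what DeepSeek-R1 is in one sentence."}))
```

Here `chain.invoke` returns a plain string because `StrOutputParser` extracts the text content from the model's chat message.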
Feel free to submit issues and feature requests, or to contribute via pull requests!
This project is licensed under the GNU General Public License; see the LICENSE file for details.