This repository provides a setup for integrating OLLAMA, OPEN-WEBUI, PIPELINES, and LANGFUSE using Docker. Follow the steps below to get everything up and running.
- Docker and required GPU drivers installed on your system.
- Clone this repository:

  ```
  git clone https://github.com/karaketir16/openwebui-langfuse.git
  cd openwebui-langfuse
  ```
- Run the setup script:

  ```
  ./run-compose.sh
  ```

  or run Docker Compose directly:

  ```
  docker compose up -d # default driver is nvidia
  ```
- Documentation:
  - You can find up-to-date documentation here.
- Download the `langfuse_filter_pipeline.py` file:
  - You can download the latest file from here, or use the provided file at `example/langfuse_filter_pipeline.py`.
- Access Langfuse:
  - Open your browser and go to `http://localhost:4000`.
- Create an Admin Account and Project:
  - Create an admin account and then create a project.
  - Go to Project Settings and create an API key.
  - Retrieve the secret key and public key.
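Once you have copied the two keys, a quick sanity check can catch a swapped or truncated key before you wire it into the pipeline. This is a hypothetical helper, assuming Langfuse's usual `pk-lf-`/`sk-lf-` key prefixes; verify the prefixes against your own Project Settings page.

```python
# Hypothetical sanity check for Langfuse API keys.
# Assumption: Langfuse issues public keys starting with "pk-lf-" and
# secret keys starting with "sk-lf-"; confirm in your project settings.
def looks_like_langfuse_keys(public_key: str, secret_key: str) -> bool:
    """Return True if both keys carry the expected Langfuse prefixes."""
    return public_key.startswith("pk-lf-") and secret_key.startswith("sk-lf-")

# Usage (illustrative placeholder keys):
# looks_like_langfuse_keys("pk-lf-1234", "sk-lf-5678")
```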
- Update the Pipeline Script:
  - In `langfuse_filter_pipeline.py`, replace `your-secret-key-here` with your secret key, `your-public-key-here` with your public key, and `https://cloud.langfuse.com` with `http://langfuse:4000`.
  - Note: if you use the example file from this repository, you still need to replace the secret key and public key with your own.
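The edits above can also be scripted. The sketch below is a hypothetical helper (not part of this repository) that swaps the three placeholder strings named in the step above; the file path in the usage comment is the example file shipped in this repo.

```python
# Hypothetical helper that rewrites the placeholders in
# langfuse_filter_pipeline.py. The placeholder strings are exactly the
# ones this README tells you to replace by hand.
from pathlib import Path

def patch_placeholders(text: str, secret_key: str, public_key: str,
                       host: str = "http://langfuse:4000") -> str:
    """Return the pipeline source with the placeholders swapped out."""
    replacements = {
        "your-secret-key-here": secret_key,
        "your-public-key-here": public_key,
        "https://cloud.langfuse.com": host,
    }
    for placeholder, value in replacements.items():
        text = text.replace(placeholder, value)
    return text

def patch_file(path: str, secret_key: str, public_key: str) -> None:
    """Patch the script in place."""
    p = Path(path)
    p.write_text(patch_placeholders(p.read_text(), secret_key, public_key))

# Usage (keys are illustrative placeholders):
# patch_file("example/langfuse_filter_pipeline.py", "sk-lf-...", "pk-lf-...")
```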
- Access Open-WebUI:
  - Open your browser and go to `http://localhost:3000`.
- Create an Admin Account:
  - Create an admin account.
- Upload the Pipeline Script:
  - Navigate to `Settings -> Admin Settings -> Pipelines`.
  - In the `Upload Pipeline` section, select the `langfuse_filter_pipeline.py` file and click the upload button.
- Monitor Usage:
  - You can now monitor Open-WebUI usage statistics from Langfuse.
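Besides the Langfuse web UI, traces can be read programmatically. The sketch below assumes Langfuse's public REST endpoint `GET /api/public/traces` with HTTP Basic auth (public key as username, secret key as password) and the `localhost:4000` mapping used by this compose setup; check the Langfuse API docs for your version before relying on it.

```python
# Sketch: fetch recent traces from Langfuse over its public REST API.
# Assumptions: /api/public/traces endpoint, Basic auth with
# public key as username and secret key as password, host on port 4000.
import base64
import json
import urllib.request

def traces_request(host: str, public_key: str, secret_key: str) -> urllib.request.Request:
    """Build an authenticated GET request for the traces endpoint."""
    credentials = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
    return urllib.request.Request(
        f"{host}/api/public/traces",
        headers={"Authorization": f"Basic {credentials}"},
    )

def fetch_traces(host: str, public_key: str, secret_key: str) -> dict:
    """Fetch traces; call this only while the compose stack is running."""
    with urllib.request.urlopen(traces_request(host, public_key, secret_key)) as resp:
        return json.load(resp)

# Usage (requires the stack to be up; keys are placeholders):
# fetch_traces("http://localhost:4000", "pk-lf-...", "sk-lf-...")
```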
- Access Open-WebUI:
  - Open your browser and go to `http://localhost:3000`.
- Create an Admin Account:
  - Create an admin account.
- Pull Models:
  - Navigate to `Settings -> Admin Settings -> Models`.
  - Enter a model tag to pull from the Ollama library (e.g., `phi3:mini`).
  - Press the pull button.
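Models can also be pulled without the UI by talking to Ollama directly. The sketch below assumes Ollama's REST endpoint `POST /api/pull` on its default port 11434; confirm the port mapping in this repo's compose file and the endpoint in the Ollama API docs.

```python
# Sketch: pull a model through Ollama's REST API instead of the UI.
# Assumptions: Ollama reachable on its default port 11434, and the
# /api/pull endpoint accepting {"name": "<tag>"} and streaming JSON
# status lines back.
import json
import urllib.request

def pull_request(tag: str, host: str = "http://localhost:11434") -> urllib.request.Request:
    """Build the POST /api/pull request for a model tag such as phi3:mini."""
    body = json.dumps({"name": tag}).encode()
    return urllib.request.Request(
        f"{host}/api/pull",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def pull_model(tag: str, host: str = "http://localhost:11434") -> None:
    """Stream pull progress; call this only while Ollama is running."""
    with urllib.request.urlopen(pull_request(tag, host)) as resp:
        for line in resp:  # each line is a JSON status object
            print(json.loads(line).get("status", ""))

# Usage (requires the stack to be up):
# pull_model("phi3:mini")
```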