# Running the Slackbot locally

## Installation from source code

The source code distribution uses Poetry.

### Install dependencies

```shell
poetry install
```
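Poetry installs the dependencies into its own virtual environment; subsequent commands can be run inside that environment (a minimal sketch, assuming Poetry 1.x):

```shell
# Run a single command inside the Poetry-managed environment
poetry run python --version

# Or open a shell with the environment activated
poetry shell
```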

Create the embedding database from a Google Drive

Create and store content

Create a Google Drive folder and safe all the documents in it. Please note, currently, shortcuts are not supported

Grant access

Make sure the service account you created has access to the folder you would like to ingest
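The e-mail address to share the folder with is the `client_email` field of the service account key file; one way to read it (a sketch, assuming `jq` is installed and using a hypothetical key path):

```shell
# Print the service account's e-mail address, i.e. the identity to share the Drive folder with
jq -r '.client_email' "$HOME/keys/kb-ingest-service-account.json"   # hypothetical path
```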

### Environment Variables

- Get the ID of the folder you would like to use for ingestion and set the environment variable:

  ```shell
  export GOOGLE_DRIVE_FOLDER_ID=<Folder ID>
  ```

- Point the application to the service account key file:

  ```shell
  export SERVICE_KEY_FILE=<Path to the service key JSON file>
  ```
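For illustration, a sketch with hypothetical values (the folder ID is the last path segment of the folder's URL in Google Drive):

```shell
# Hypothetical example values
export GOOGLE_DRIVE_FOLDER_ID=1AbCdEfGhIjKlMnOpQrStUvWx
export SERVICE_KEY_FILE="$HOME/keys/kb-ingest-service-account.json"
```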

### Run the ingest script

```shell
./knowledge_base_gpt/apps/ingest/ingest.py
```
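If the script is not marked executable, it can also be started through the Poetry environment (a sketch, assuming the repository root as the working directory):

```shell
poetry run python knowledge_base_gpt/apps/ingest/ingest.py
```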

## Run a local Redis

The Slackbot relies on Redis for storage. You can use Podman and Quadlet to run it locally.

1. Create the file `~/.config/containers/systemd/redis.container`:

   ```ini
   [Container]
   Image=docker.io/redis/redis-stack:latest
   PublishPort=6379:6379
   PublishPort=8001:8001
   ContainerName=redis
   ```

2. Reload the systemd user daemon:

   ```shell
   systemctl --user daemon-reload
   ```

3. Start the service:

   ```shell
   systemctl --user start redis.service
   ```
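To verify that the container is up and Redis is responding, a quick check (assuming `podman` is on the PATH; the container name comes from `ContainerName=redis` in the unit above):

```shell
systemctl --user status redis.service
podman exec redis redis-cli ping   # expected output: PONG
```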

## Run the proxy to the Ollama LLM

TBD

## Run the Slack Bot backend

### Environment Variables

- Get the Bot and Application tokens and set the corresponding environment variables:

  ```shell
  export SLACK_BOT_TOKEN=<Bot Token>
  export SLACK_APP_TOKEN=<Application Token>
  ```

- Set the name of the Slack channel to forward unanswered questions to:

  ```shell
  export FORWARD_QUESTION_CHANNEL_NAME=<Channel Name>
  ```

- Set the location to store the chat logs:

  ```shell
  export CHAT_LOGS_FILE=<Path to the chat log file>
  ```
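For illustration, a sketch with hypothetical values (Slack bot tokens start with `xoxb-` and app-level tokens with `xapp-`):

```shell
# Hypothetical example values
export SLACK_BOT_TOKEN=xoxb-your-bot-token
export SLACK_APP_TOKEN=xapp-your-app-level-token
export FORWARD_QUESTION_CHANNEL_NAME=unanswered-questions
export CHAT_LOGS_FILE="$HOME/knowledge_base_gpt/chat_logs"
```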

### Run the Slack Bot backend

```shell
python -m knowledge_base_gpt.apps.slackbot
```
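As with the ingest script, the backend can be launched through the Poetry environment if it is not already activated (a sketch):

```shell
poetry run python -m knowledge_base_gpt.apps.slackbot
```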