Commit

Support multiple LLM providers
tpai committed Feb 18, 2024
1 parent 0e0b164 commit 500d019
Showing 3 changed files with 39 additions and 17 deletions.
43 changes: 33 additions & 10 deletions README.md
@@ -2,11 +2,6 @@

An AI-powered text summarization Telegram bot that generates concise summaries of text, URLs, PDFs and YouTube videos.

**⚠️ Free credits expired on 1 Oct 2023 ⚠️**

- EN Bot: ~~https://t.me/summarygptenbot~~ (retired)
- 繁中 Bot: ~~https://t.me/summarygptzhtwbot~~ (retired)

> Thanks for using, feel free to self-host your own summary bot.
## Features
@@ -18,16 +13,44 @@ An AI-powered text summarization Telegram bot that generates concise summaries of

## Usage

Launch your own GPT-4 summary bot with 32k token context in one line command 🚀
Launch a GPT-4 summary bot using OpenAI.

```sh
docker run -d -e TELEGRAM_TOKEN=$YOUR_TG_TOKEN -e OPENAI_API_KEY=$YOUR_API_KEY -e OPENAI_MODEL=gpt-4-32k -e CHUNK_SIZE=20000 -e TS_LANG=$YOUR_LANGUAGE tonypai/summary-gpt-bot:latest
docker run -d \
-e LLM_MODEL=gpt-4 \
-e OPENAI_API_KEY=$OPENAI_API_KEY \
-e TELEGRAM_TOKEN=$YOUR_TG_TOKEN \
-e TS_LANG=$YOUR_LANGUAGE \
tonypai/summary-gpt-bot:latest
```

Launch a summary bot using Azure OpenAI.

```sh
docker run -d \
-e AZURE_API_BASE=https://<your_azure_resource_name>.openai.azure.com \
-e AZURE_API_KEY=$AZURE_API_KEY \
-e AZURE_API_VERSION=2024-02-15-preview \
-e LLM_MODEL=azure/<your_deployment_name> \
-e TELEGRAM_TOKEN=$YOUR_TG_TOKEN \
-e TS_LANG=$YOUR_LANGUAGE \
tonypai/summary-gpt-bot:latest
```

LLM Variables

| Environment Variable | Description |
|----------------------|-------------|
| AZURE_API_BASE | Base URL for the Azure OpenAI API |
| AZURE_API_KEY | API key for the Azure OpenAI API |
| AZURE_API_VERSION | API version for the Azure OpenAI API |
| OPENAI_API_KEY | API key for the OpenAI API |
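
These variables are consumed by LiteLLM rather than by the bot code itself: the provider is inferred from the `LLM_MODEL` value, so an `azure/<deployment>` model name routes to Azure OpenAI, while a bare model name such as `gpt-4` routes to OpenAI. A minimal sketch of that routing (illustrative only, not part of the diff), assuming the same environment variables listed above:

```python
import os
from litellm import completion

# Provider selection happens inside LiteLLM: a bare model name such as "gpt-4"
# uses OPENAI_API_KEY, while an "azure/<deployment>" model uses AZURE_API_KEY,
# AZURE_API_BASE and AZURE_API_VERSION from the environment.
model = os.environ.get("LLM_MODEL", "gpt-3.5-turbo-16k")

response = completion(
    model=model,
    messages=[{"role": "user", "content": "Summarize: LiteLLM routes requests by model prefix."}],
)
print(response.choices[0].message.content)
```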

Bot Variables

| Environment Variable | Description |
|----------------------|-------------|
| TELEGRAM_TOKEN | Token for Telegram API (required) |
| OPENAI_API_KEY | API key for OpenAI GPT API (required) |
| OPENAI_MODEL | Model to use for text summarization (default: gpt-3.5-turbo-16k) |
| CHUNK_SIZE | The maximum token of a chunk when receiving a large input (default: 10000) |
| LLM_MODEL | LLM model to use for text summarization (default: gpt-3.5-turbo-16k) |
| TELEGRAM_TOKEN | Token for Telegram API (required) |
| TS_LANG | Language of the text to be summarized (default: Taiwanese Mandarin) |
9 changes: 4 additions & 5 deletions main.py
@@ -1,8 +1,8 @@
import asyncio
import openai
import os
import re
import trafilatura
from litellm import completion
from duckduckgo_search import AsyncDDGS
from PyPDF2 import PdfReader
from concurrent.futures import ThreadPoolExecutor
@@ -12,8 +12,7 @@
from youtube_transcript_api import YouTubeTranscriptApi

telegram_token = os.environ.get("TELEGRAM_TOKEN", "xxx")
apikey = os.environ.get("OPENAI_API_KEY", "xxx")
model = os.environ.get("OPENAI_MODEL", "gpt-3.5-turbo-16k")
model = os.environ.get("LLM_MODEL", "gpt-3.5-turbo-16k")
lang = os.environ.get("TS_LANG", "Taiwanese Mandarin")
chunk_size = int(os.environ.get("CHUNK_SIZE", 10000))

@@ -134,8 +133,8 @@ def call_gpt_api(prompt, additional_messages=[]):
Call GPT API
"""
try:
openai.api_key = apikey
response = openai.ChatCompletion.create(
response = completion(
# response = openai.ChatCompletion.create(
model=model,
messages=additional_messages+[
{"role": "user", "content": prompt}
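
The diff only shows the lines around the `completion(...)` call. A rough sketch of how the full `call_gpt_api` helper might look after this change, assuming the response handling follows the OpenAI-compatible shape LiteLLM returns (the repository's actual error handling and post-processing may differ):

```python
from litellm import completion

def call_gpt_api(prompt, additional_messages=[]):
    """
    Call the configured LLM through LiteLLM (sketch based on the visible diff).
    """
    try:
        response = completion(
            model=model,  # e.g. "gpt-4" or "azure/<your_deployment_name>"
            messages=additional_messages + [
                {"role": "user", "content": prompt}
            ],
        )
        # LiteLLM returns an OpenAI-compatible response object.
        return response.choices[0].message.content.strip()
    except Exception as e:
        print(f"Error calling the LLM API: {e}")
        return ""
```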
4 changes: 2 additions & 2 deletions requirements.txt
@@ -4,8 +4,8 @@ asyncio==3.4.3
# progress tracking
tqdm==4.66.1

# natural language processing, newer versions won't work
openai==0.28
# llm adapter
litellm==1.21.7

# text extraction
trafilatura==1.7.0
