A Rust API for interacting with the OpenAI chat completion API.
hjowdy is a simple Rust API designed to interact with the OpenAI chat completion API. It stores chats and their message history in a PostgreSQL database.
- Clone the repository:

  ```bash
  git clone https://github.com/example/hjowdy.git
  ```

- Install Rust and Cargo from the official website.
- Follow the Database and Environment Variables section below to set up PostgreSQL and configure the necessary environment variables.
- Compile and run the API server:

  ```bash
  cargo run
  ```
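The server listens on the SERVER_ADDR configured in `.env` (127.0.0.1:8080 in the example below). Assuming that default address, a quick way to check that it is up is to call one of the endpoints documented later:

```bash
# Create a chat for app user 1 to confirm the server and database are reachable
curl -X POST "http://127.0.0.1:8080/create_chat/1"
```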
hjowdy uses PostgreSQL as the database to store chat and message history. To set it up, follow these steps:
- Install PostgreSQL on your system (download and installation instructions are available on the official PostgreSQL website).
- Ensure PostgreSQL is running (a quick way to check this is shown after these steps).
- Set up the required PostgreSQL credentials and OpenAI API key in the `.env` file: copy `.env.example` to `.env` and fill in the necessary variables:
  ```
  SERVER_ADDR=127.0.0.1:8080
  PG.USER=<Your PostgreSQL username>
  PG.PASSWORD=<Your PostgreSQL password>
  PG.HOST=<Your PostgreSQL host>
  PG.PORT=<Your PostgreSQL port>
  PG.DBNAME=<Your PostgreSQL database name>
  PG.POOL.MAX_SIZE=<Your PostgreSQL max pool size>
  OPENAI_API_KEY=<Your OpenAI API key>
  ```
- Run the `setup_database.sh` script to create the `chathistory` database and the necessary tables:

  ```bash
  chmod +x setup_database.sh
  ./setup_database.sh
  ```
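To confirm that PostgreSQL is reachable and that the script created its tables, something along these lines should work; the exact tables depend on what `setup_database.sh` defines, so treat this as a quick sanity check rather than part of the setup:

```bash
# Check that the PostgreSQL server is accepting connections
pg_isready -h <Your PostgreSQL host> -p <Your PostgreSQL port>

# List the tables created in the chathistory database
psql -h <Your PostgreSQL host> -U <Your PostgreSQL username> -d chathistory -c '\dt'
```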
Refer to the Example CURLs section below for some examples of how to make requests to the API.
Here are some example CURL requests to help you get started:
- Create a chat:

  ```bash
  curl -X POST "http://localhost:8080/create_chat/1"
  ```

- Send a message to the chat:

  ```bash
  curl -X POST "http://localhost:8080/chat/1" \
    -H "Content-Type: application/json" \
    -d '{"messages": [{"role": "user", "content": "Hello!"}]}'
  ```
- `POST /create_chat/{app_user}` - Creates a new chat
- `GET /chats/{app_user}` - Retrieves all chats for the specified user
- `POST /chat/{chat_id}` - Sends a message and retrieves the chatbot response
- `GET /chats/{chat_id}/messages` - Retrieves all messages in a chat
- `PUT /update_chat_name` - Updates the chat name
- `DELETE /delete_chat/{chat_id}` - Deletes a chat
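The Example CURLs above cover creating a chat and sending a message. The endpoints that take no request body can be exercised the same way; the paths below come straight from the list above, and the user and chat IDs are simply the ones used in the earlier examples. `PUT /update_chat_name` is omitted because its expected request body is not documented here.

```bash
# Retrieve all chats for app user 1
curl "http://localhost:8080/chats/1"

# Retrieve all messages in chat 1
curl "http://localhost:8080/chats/1/messages"

# Delete chat 1
curl -X DELETE "http://localhost:8080/delete_chat/1"
```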
To interact with the API, send JSON payloads in the request body. For example, when calling the `/chat/{chat_id}` endpoint, the request body should look like this:

```json
{
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful assistant."
    },
    {
      "role": "user",
      "content": "Who won the world series in 2020?"
    }
  ]
}
```
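As a concrete illustration, the same payload can be sent with curl to a previously created chat (chat ID 1 here is just the one from the examples above):

```bash
curl -X POST "http://localhost:8080/chat/1" \
  -H "Content-Type: application/json" \
  -d '{
        "messages": [
          {"role": "system", "content": "You are a helpful assistant."},
          {"role": "user", "content": "Who won the world series in 2020?"}
        ]
      }'
```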
The API will return a JSON object containing the chatbot's response. In the example above, the response might look like this:
```json
{
  "id": "chatcmpl-6p9XYPYSTTRi0xEviKjjilqrWU2Ve",
  "object": "chat.completion",
  "created": 1677649420,
  "model": "gpt-3.5-turbo-0301",
  "usage": {
    "prompt_tokens": 56,
    "completion_tokens": 31,
    "total_tokens": 87
  },
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": "The Los Angeles Dodgers won the World Series in 2020."
      },
      "finish_reason": "stop",
      "index": 0
    }
  ]
}
```
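If you only need the assistant's reply, it can be pulled out of that JSON with jq (assuming jq is installed; the path below matches the example response structure shown above):

```bash
curl -s -X POST "http://localhost:8080/chat/1" \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello!"}]}' \
  | jq -r '.choices[0].message.content'
```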
- Fork the repository 🍴
- Create a new branch with your feature or bugfix 🌿
- Commit changes with descriptive commit messages 📝
- Push your branch to the remote fork 🔌
- Submit a pull request back to the original repository 🤲
For more information, check out the OpenAI API documentation.
Good luck! 🌄