A quick-start template using the OpenAI Assistants API with Next.js.
1. Clone the repository:

```shell
git clone https://github.com/openai/openai-assistants-quickstart.git
cd openai-assistants-quickstart
```
2. Set your OpenAI API key:

```shell
export OPENAI_API_KEY="sk-..."
```

(or set it in `.env.example` and rename that file to `.env`).
3. Install dependencies:

```shell
npm install
```

4. Run the development server:

```shell
npm run dev
```
5. Navigate to http://localhost:3000.
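If you take the `.env` route in step 2 instead of exporting the variable, the renamed file would contain just the key (the value shown is a placeholder):

```shell
# .env (loaded automatically by Next.js); the key value is a placeholder
OPENAI_API_KEY="sk-..."
```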
You can deploy this project to Vercel or any other platform that supports Next.js.
This project is intended to serve as a template for using the Assistants API in Next.js with streaming, tool use (code interpreter and file search), and function calling. While there are multiple pages to demonstrate each of these capabilities, they all use the same underlying assistant with all capabilities enabled.
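As a sketch, an assistant with all of these capabilities enabled would be created with a tool configuration along these lines (the model choice and the `get_weather` function schema are illustrative, not the repo's exact setup):

```typescript
// Tool configuration enabling all three capabilities. An object like this
// is passed to openai.beta.assistants.create(...) in the openai-node SDK.
// The model and the get_weather schema below are placeholders.
export const assistantParams = {
  model: "gpt-4o",
  instructions: "You are a helpful assistant.",
  tools: [
    { type: "code_interpreter" },
    { type: "file_search" },
    {
      type: "function",
      function: {
        name: "get_weather", // illustrative function
        description: "Look up the weather for a location",
        parameters: {
          type: "object",
          properties: { location: { type: "string" } },
          required: ["location"],
        },
      },
    },
  ],
};
```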
The main logic for chat can be found in the `Chat` component in `app/components/chat.tsx`, and in the route handlers starting with `api/assistants/threads` (found in `api/assistants/threads/...`). Feel free to start your own project and copy some of this logic in! The `Chat` component itself can be copied and used directly, provided you copy the styling from `app/components/chat.module.css` as well.
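For instance, the function-calling page wires a handler into `Chat` that resolves each tool call the assistant makes into a result string. A minimal sketch of such a handler (the `get_weather` function and its payload are made up for illustration):

```typescript
// Sketch of a handler you could pass to the Chat component's
// functionCallHandler prop. The get_weather function and its fixed
// payload are hypothetical; a real handler would call an actual API.
type ToolCall = { function: { name: string; arguments: string } };

export async function functionCallHandler(toolCall: ToolCall): Promise<string> {
  if (toolCall.function.name !== "get_weather") return "";
  // Arguments arrive as a JSON string produced by the assistant.
  const { location } = JSON.parse(toolCall.function.arguments);
  // A fixed value stands in for a real weather lookup.
  return JSON.stringify({ location, temperature: 57, unit: "F" });
}
```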
- Basic Chat Example: http://localhost:3000/examples/basic-chat
- Function Calling Example: http://localhost:3000/examples/function-calling
- File Search Example: http://localhost:3000/examples/file-search
- Full-featured Example: http://localhost:3000/examples/all
- `app/components/chat.tsx` - handles chat rendering, streaming, and function call forwarding
- `app/components/file-viewer.tsx` - handles uploading, fetching, and deleting files for file search
- `api/assistants` - `POST`: create assistant (only used at startup)
- `api/assistants/threads` - `POST`: create new thread
- `api/assistants/threads/[threadId]/messages` - `POST`: send message to assistant
- `api/assistants/threads/[threadId]/actions` - `POST`: inform assistant of the result of a function it decided to call
- `api/assistants/files` - `GET`/`POST`/`DELETE`: fetch, upload, and delete assistant files for file search
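Client-side, the thread endpoints compose into a simple flow: create a thread once, then post messages to it. A sketch of the requests involved (paths come from the list above; the `content` body field is an assumption, so verify it against the handlers):

```typescript
// Describe each request as a plain { url, init } pair so the flow is easy
// to read and test; in the app you would pass these straight to fetch().
export function createThreadRequest() {
  return { url: "/api/assistants/threads", init: { method: "POST" } };
}

export function sendMessageRequest(threadId: string, content: string) {
  return {
    url: `/api/assistants/threads/${threadId}/messages`,
    init: {
      method: "POST",
      // Body field name is an assumption; check the handler under
      // api/assistants/threads before relying on it.
      body: JSON.stringify({ content }),
    },
  };
}
```

Usage: `const { url, init } = sendMessageRequest(threadId, "Hello!"); await fetch(url, init);`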
Let us know if you have any thoughts, questions, or feedback in this form!