OpenCopilot allows you to have your own product's AI copilot. It integrates with your underlying APIs and is able to execute API calls whenever needed. It uses LLMs to determine if the user's request requires calling an API endpoint. Then, it decides which endpoint to call and passes the appropriate payload based on the given API definition.
- Provide your API/backend definition, including your public endpoints and how to call them. Currently, OpenCopilot supports Swagger OpenAPI 3.0 (see the example definition below). We're also working on a UI that lets you add endpoints dynamically.
- OpenCopilot validates your schema to achieve the best results.
- We feed the API definition to an LLM.
- Finally, you can integrate our user-friendly chat bubble into your SaaS app.
You can try it out on opencopilot.so
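For reference, here is a minimal sketch of the kind of Swagger/OpenAPI 3.0 definition you would provide. The API, its /cases endpoint, the base URL, and the field names are invented purely for illustration:

```yaml
openapi: 3.0.0
info:
  title: Example SaaS API   # hypothetical API, for illustration only
  version: 1.0.0
servers:
  - url: https://api.example.com
paths:
  /cases:
    post:
      operationId: createCase
      summary: Initiate a new case
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              required: [title]
              properties:
                title:
                  type: string
                  description: Short summary of the problem
                description:
                  type: string
                  description: Longer description of the problem
      responses:
        "201":
          description: Case created
```

Clear summary and description fields matter here, since the LLM relies on them to decide which endpoint matches a user's request and what payload to send.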
- Shopify is developing "Shopify Sidekick."
- Microsoft is working on "Windows Copilot."
- GitHub is in the process of creating "GitHub Copilot."
- Microsoft is also developing "Bing Copilot."
Our goal is to empower every SaaS product to have its own AI copilot, tailored to that product.
- It is capable of calling your underlying APIs.
- It can transform the response into meaningful text.
- It can automatically populate certain request payload fields based on the context.
- For instance, you can request an action like "Initiate a new case about X problem," and the title field will be filled in automatically with an appropriate value (see the sketch below).
- Currently, it does not support calling multiple endpoints simultaneously (feature coming soon).
- It is not suitable for handling large APIs.
- It is not equipped to handle complex APIs.
- It cannot remember chat history (every message is handled independently of previous messages).
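To make the payload-filling capability above concrete: for a message like "Initiate a new case about the checkout page crashing on mobile," the copilot selects the matching endpoint from your definition and issues a call roughly equivalent to the sketch below. The endpoint, base URL, and field names are the hypothetical ones from the example definition above, not a fixed OpenCopilot contract:

```ts
// Hypothetical sketch of what the copilot's generated API call boils down to.
// The /cases endpoint, base URL, and fields come from the illustrative
// definition above; they are not part of OpenCopilot itself.
const payload = {
  // The title is inferred from the user's message by the LLM rather than
  // being typed explicitly by the user.
  title: "Checkout page crashes on mobile",
  description: "Reported via the copilot chat.",
};

const response = await fetch("https://api.example.com/cases", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(payload),
});

// The JSON response is then transformed back into a human-readable chat reply.
const createdCase = await response.json();
console.log(createdCase);
```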
- Create unlimited copilots.
- Embed the copilot on your SaaS product using standard JS calls (see the embedding sketch after this list).
- TypeScript chat bubble.
- Provide Swagger definitions for your APIs.
- Swagger definition validator + recommender.
- [in progress] UI endpoints editor.
- Chat memory.
- Vector DB support for large Swagger files.
- Plugins system to support different types of authentications.
- Offline LLMs.
- Ability to ingest text data, PDF files, websites, and extra data sources.
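As an illustration of the embedding item above, a host page might load the chat bubble with a snippet along these lines. This is a hypothetical sketch, not OpenCopilot's actual embed API; the script URL, the global name, and the option names are invented for illustration:

```ts
// Hypothetical embed sketch — the script URL, the window.OpenCopilot global,
// and the option names below are placeholders, not the real embed API.
function loadCopilotBubble(copilotToken: string): void {
  const script = document.createElement("script");
  script.src = "https://example.com/opencopilot-bubble.js"; // placeholder URL
  script.async = true;
  script.onload = () => {
    // Assumed initializer exposed by the bubble bundle (illustrative only).
    (window as any).OpenCopilot?.init({
      token: copilotToken,           // identifies which copilot to load
      rootSelector: "#copilot-root", // where to mount the chat bubble
    });
  };
  document.head.appendChild(script);
}

loadCopilotBubble("YOUR_COPILOT_TOKEN");
```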
We love hearing from you! Got any cool ideas or requests? We're all ears! So, if you have something in mind, give us a shout!
- Make sure you have Docker installed.
- To begin, clone this Git repository:
  git clone git@github.com:openchatai/OpenCopilot.git
- Update llm-server/Dockerfile with your OPENAI_API_KEY key:
  ENV OPENAI_API_KEY YOUR_TOKEN_HERE
- Navigate to the repository folder and run the following command (macOS or Linux):
  make install
Once the installation is complete, you can access the OpenCopilot console at: http://localhost:8000
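If the console does not load, one way to sanity-check the setup is to confirm the containers are running and that something is listening on port 8000 (standard Docker and curl commands, nothing OpenCopilot-specific):

```sh
docker ps                      # the OpenCopilot containers should show as Up
curl -I http://localhost:8000  # should return an HTTP response from the console
```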
Documentation available here
This project follows the all-contributors specification. Contributions of any kind welcome!