Tip: If you are looking for a managed version of OpenCopilot, check out the cloud version. It's production-ready, with our latest planning engine that can understand and handle complex user requests.
Documentation available here
OpenCopilot allows you to have your own product's AI copilot. It integrates with your underlying APIs and can execute API calls whenever needed. It uses LLMs to determine if the user's request requires calling an API endpoint. Then, it decides which endpoint to call and passes the appropriate payload based on the given API definition.
- Provide your API/action definitions, including your public endpoints and how to call them. Currently, OpenCopilot supports Swagger/OpenAPI 3.0 for bulk import.
- OpenCopilot validates your schema to achieve the best results.
- Finally, you can integrate our user-friendly chat bubble into your SaaS app.
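For bulk import, the definition is a standard Swagger/OpenAPI 3.0 document. A minimal sketch of what one might look like (the endpoint and field names here are illustrative, not part of OpenCopilot):

```yaml
openapi: 3.0.0
info:
  title: Example SaaS API
  version: 1.0.0
paths:
  /cases:
    post:
      operationId: createCase
      summary: Create a new support case
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                title:
                  type: string
      responses:
        "200":
          description: Case created
```

The `operationId` and `summary` fields are what give the LLM enough context to decide when this endpoint should be called.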
- Make sure you have Docker installed.
- To begin, clone this Git repository:
git clone [email protected]:openchatai/OpenCopilot.git
In the .env file located in the llm-server directory, replace the placeholder value for the OPENAI_API_KEY variable with your actual token:
OPENAI_API_KEY=YOUR_TOKEN_HERE
To install the necessary dependencies and set up the environment for OpenCopilot, use the following command:
make install
If you are using an ARM machine (for example, an Apple Silicon Mac), use the following command to install dependencies and set up the environment:
make install-arm
Once the installation is complete, you can access the OpenCopilot console at http://localhost:8888.
- make migrate: Run Alembic migrations.
- make down: Stop and remove all containers.
- make exec-dashboard: Access the dashboard container's shell.
- make exec-llm-server: Access the llm-server container's shell.
- make restart: Restart all containers.
- make logs: Show container logs.
- make purge: Fully clean uninstall (remove containers, networks, volumes, .env).
- make help: Display help message with available targets.
You can try it out on opencopilot.so
(OpenCopilot is not affiliated with Shopify, and Shopify does not use OpenCopilot; this is just a demo of what copilots are capable of.)
- Shopify is developing "Shopify Sidekick."
- Microsoft is working on "Windows Copilot."
- GitHub is in the process of creating "GitHub Copilot."
- Microsoft is also developing "Bing Copilot."
Our goal is to empower every SaaS product to have its own AI copilot, tailored to its unique product.
- It is capable of calling your underlying APIs.
- It can transform the response into meaningful text.
- It can automatically populate certain request payload fields based on the context.
- For instance, you can request actions like: "Initiate a new case about X problem," and the title field will be automatically filled with the appropriate name.
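The auto-fill behavior above can be illustrated with a toy heuristic. In the real product an LLM performs this step; `extract_title` and `build_payload` below are hypothetical helpers for illustration, not part of OpenCopilot's API:

```python
import re

def extract_title(user_request: str) -> str:
    """Toy stand-in for the LLM step: pull a title out of a
    free-form request like 'Initiate a new case about X problem'."""
    match = re.search(r"about (.+)", user_request, re.IGNORECASE)
    return match.group(1).strip() if match else user_request

def build_payload(user_request: str) -> dict:
    # The copilot fills payload fields it can infer from context;
    # here we only derive the title and a default status.
    return {"title": extract_title(user_request), "status": "open"}

payload = build_payload("Initiate a new case about a billing discrepancy")
print(payload["title"])  # a billing discrepancy
```

The point is not the regex: the LLM infers each payload field from the conversation, so the user never has to spell out the request body.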
- It is not suitable for handling large APIs (you will need to write JSON transformers to make it work; refer to the docs for more details).
Most of the time, the copilot can figure out which actions to execute when the user requests something, but for complex flows you can define them explicitly to help the copilot:
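A flow definition might look like the following sketch. The field names here are illustrative assumptions, not the actual schema; consult the docs for the real format:

```yaml
name: refund_order
description: Look up an order, then issue a refund for it
steps:
  - operation: getOrder       # resolve the order first
  - operation: createRefund   # then call the refund endpoint
```

Defining the sequence up front saves the copilot from having to infer the ordering of dependent API calls on its own.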
Fewer than 10 lines of code to integrate into your web app or desktop app.
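The embed is essentially a script tag plus an init call, along these lines. This is a hedged sketch: the script URL, function name, and options are illustrative, so copy the exact snippet generated by your dashboard instead:

```html
<!-- Illustrative embed; the dashboard generates the real snippet -->
<script src="https://cdn.example.com/opencopilot-widget.js"></script>
<script>
  initAiCoPilot({
    apiUrl: "http://localhost:8888/backend", // your backend server
    token: "YOUR_COPILOT_TOKEN",             // from the dashboard
  });
</script>
```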
- The backend server (API) is reachable via http://localhost:8888/backend
- The dashboard server is reachable via http://localhost:8888/
- You can also use our SDK
This project follows the all-contributors specification. Contributions of any kind are welcome!
- Learn how the OpenCopilot codebase works and how you can contribute using Onboard AI's tool: learnthisrepo.com/opencopilot
This product collects anonymous usage data to help improve your experience. You can opt out by setting ENABLE_EXTERNAL_API_LOGGING=no in your environment variables.