codesherpa is a code interpreter ChatGPT plugin and a standalone code interpreter (experimental). Read the Quickstart section to try it out.
- Code interpreter plugin for ChatGPT
- API for ChatGPT to run code, with file persistence and no timeout
- Standalone code interpreter (experimental)
- Using OpenAI's GPT function calling, I've tried to recreate the experience of the ChatGPT Code Interpreter with functions (see the sketch below). For many reasons, this implementation differs significantly from the ChatGPT Code Interpreter created by OpenAI. It's still very buggy and inconsistent, but I wanted to release it for those interested.
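To give a rough idea of the pattern, here is a minimal sketch of GPT function calling with a single hypothetical `run_code` function. This is not codesherpa's actual implementation: the function name, schema, prompt, and model choice are illustrative assumptions, and it uses the pre-1.0 `openai` Python package that was current when this README was written.

```python
# Minimal sketch of the GPT function-calling pattern (illustrative only; the
# `run_code` function and its schema are assumptions, not codesherpa's actual spec).
import json
import os

import openai  # pre-1.0 openai package

openai.api_key = os.environ["OPENAI_API_KEY"]

functions = [
    {
        "name": "run_code",
        "description": "Execute a snippet of Python code and return its output.",
        "parameters": {
            "type": "object",
            "properties": {
                "code": {"type": "string", "description": "Python code to execute."}
            },
            "required": ["code"],
        },
    }
]

messages = [{"role": "user", "content": "Plot 1/sin(x) and describe its asymptotes."}]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=messages,
    functions=functions,
    function_call="auto",
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The model wants code executed: parse the arguments and hand the code to
    # an execution backend (in codesherpa's case, the local API server).
    args = json.loads(message["function_call"]["arguments"])
    print(args["code"])
```

In a full loop, the execution result would be appended to `messages` as a `function`-role message and the model queried again, which is roughly how a function-calling chat UI drives the conversation.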
Standalone code interpreter demo:
CS-standalone.mp4
ChatGPT Plugin demo:
codesherpa.ChatGPT.Code.Interpreter.prompts.-.music.csv.demo-1-highlight.mp4
See more examples here
- July 13, 2023:
- A basic standalone UI is now available. Read the Quickstart section to try it (requires an OpenAI API key). NOTE: expect many bugs and shortcomings, especially if you've been binging OpenAI's Code Interpreter since it's been made generally available to Plus subscribers.
- New contributions from emsi (#18) and PeterDaveHello (#28)! 👏
- June 21, 2023:
- The ChatGPT plugin service now fetches the `openapi.json` generated by the server. Request example data is also included in the API spec, which reduces the size of the plugin manifest's `description_for_model`.
- Updated the README section on future work.
Previous Updates
- June 18, 2023: Added `docker-compose.yml`.
- May 31, 2023: Introduced a new file upload interface via `upload.html` and a corresponding server endpoint, allowing you to upload files at `localhost:3333/upload` or by telling ChatGPT you want to upload a file or have a file you want to work with. Refactored Python code execution using the `ast` module for enhanced efficiency (a sketch of the technique follows this list). Updated the local server and manifest file to support these features, along with minor updates to REPL execution, error handling, and code formatting.
- May 22, 2023: Refactored the README to provide clear and concise instructions for building and running codesherpa.
- May 20, 2023: codesherpa now supports multiple programming languages, including Python, C++, and Rust.
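For context on the `ast`-based execution mentioned in the May 31 entry, here is a minimal sketch of the general technique (not codesherpa's actual code): run every statement, and if the final statement is an expression, evaluate it separately so its value can be returned the way a REPL would.

```python
# Sketch of REPL-style execution with the ast module (illustrative only,
# not codesherpa's exact implementation).
import ast


def run_repl_style(source: str, namespace: dict | None = None):
    """Execute `source`; if the last statement is an expression, return its value."""
    namespace = {} if namespace is None else namespace
    tree = ast.parse(source, mode="exec")
    if tree.body and isinstance(tree.body[-1], ast.Expr):
        # Split off the trailing expression so it can be compiled in "eval" mode.
        last_expr = ast.Expression(tree.body.pop().value)
        exec(compile(tree, "<code>", "exec"), namespace)
        return eval(compile(last_expr, "<code>", "eval"), namespace)
    exec(compile(tree, "<code>", "exec"), namespace)
    return None


print(run_repl_style("x = 21\nx * 2"))  # -> 42
```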
To try the new chat interface:
```bash
# Clone the repository
git clone https://github.com/iamgreggarcia/codesherpa.git
```
Add your `OPENAI_API_KEY` to a copy of `.env.example`:
```bash
cd codesherpa/frontend
cp .env.example .env.local
```
Install dependencies and start the Next.js app:
```bash
pnpm install
pnpm dev
```
OR
```bash
npm install
npm run dev
```
Download the Docker image OR run the codesherpa API locally (beware!):
Docker image:
```bash
# Pull the Docker image
docker pull ghcr.io/iamgreggarcia/codesherpa:latest

# Run the Docker image locally
docker compose up
```
Run the server locally (potentially risky!):
```bash
cd codesherpa
make dev
```
Navigate to http://localhost:3000. Expect bugs and inconsistencies.
Ensure the following software is installed on your system:
- Python 3.10
- Docker
- Docker Compose (optional): install Docker Desktop or the Compose plugin to use Docker Compose
Option 1: Using the Docker image from GitHub Packages
```bash
# Pull the Docker image
docker pull ghcr.io/iamgreggarcia/codesherpa:latest

# Run the Docker image locally
docker compose up
```
Option 2: Using the repository and Make commands
```bash
# Clone the repository
git clone https://github.com/iamgreggarcia/codesherpa.git

# Navigate to the repository directory
cd codesherpa

# Build the Docker image using Make
make build-docker

# Run the Docker image locally
make run-docker-localserver
```
Option 3: Using the repository and Docker commands
Instead of the Make commands, you can use the following Docker commands directly, or use Docker Compose:
```bash
# Clone the repository
git clone https://github.com/iamgreggarcia/codesherpa.git

# Navigate to the repository directory
cd codesherpa

# Build the Docker image
docker build -t codesherpa .

# Run the Docker image locally
docker run -p 3333:3333 codesherpa python3 -c "import localserver.main; localserver.main.start()"

# OR use Docker Compose
docker compose up
```
Whichever option you choose, codesherpa will be accessible at `localhost:3333`.
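To sanity-check that the API is up before connecting the plugin, you can fetch the server-generated OpenAPI spec. This is an illustrative check only: it assumes the spec is served at `/openapi.json` on the default port and uses the third-party `requests` package.

```python
# Illustrative health check: fetch the OpenAPI spec the server generates.
# Assumes the default port 3333 and the /openapi.json path; requires `requests`.
import requests

resp = requests.get("http://localhost:3333/openapi.json", timeout=5)
resp.raise_for_status()
spec = resp.json()
print(spec.get("info", {}).get("title"), "-", len(spec.get("paths", {})), "paths")
```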
- Navigate to the ChatGPT UI, and access the plugin store.
- Select "Develop your own plugin".
- In the plugin URL input, enter `localhost:3333`. ChatGPT should now be able to use codesherpa's features.
Below are some examples. Please note that portions of these videos are edited and/or sped up for brevity.
Vector.field.on.a.sphere.animation.mp4
Most of us have seen the ChatGPT Code Interpreter video demo, which is the inspiration for this project, so I thought it fitting to ask questions similar to those in the OpenAI video demo.
- Asking about properties of the function `1/sin(x)`:
codesherpa.ChatGPT.Code.Interpreter.prompts.-.demo.mp4
- Uploading a `music.csv` dataset for analysis and visualization. View the ChatGPT conversation
codesherpa.ChatGPT.Code.Interpreter.prompts.-.music.csv.demo-1.mp4
- Improve the UI (maybe). The UI needs a lot of love:
- multi-conversation support
- previous message editing
- granular control over parameters
- consistent function call rendering
I welcome contributions! If you have an idea for a feature or want to report a bug, please open an issue or submit a pull request.
Steps to contribute:
- Fork this repository.
- Create a feature branch: `git checkout -b feature/YourAmazingIdea`
- Commit your changes: `git commit -m 'Add YourAmazingIdea'`
- Push to the branch: `git push origin feature/YourAmazingIdea`
- Submit a Pull Request.
codesherpa is independently developed and not affiliated, endorsed, or sponsored by OpenAI.
This project is licensed under the terms of the MIT license.