Promptmodel is a collaborative prompt & model engineering framework that offers the following:
- Streamlined prompt engineering collaboration for developers and non-developers.
- Web editor designed for prompt engineering with difference visualization & built-in version tracking.
- SDK (Python) to integrate the prompts with your existing codebase.
- Dashboard for product-level evaluation & prompt management (A/B tests coming soon).
Click on the image above to start the interactive demo.
You can explore more demos here.
Managed deployment by the Promptmodel team; a generous free tier (hobby plan) is available, no credit card required.
Requirements: Docker, Docker Compose (e.g. via Docker Desktop)
```bash
# Clone repository
git clone https://github.com/promptmodel/promptmodel.git
cd promptmodel

# Run server and database
docker compose up -d
```
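To confirm the stack came up cleanly, check the container status and logs. A minimal sketch using standard Docker Compose commands (the web editor's port is not specified here, so check the repository's docker-compose.yml for the actual mapping):

```bash
# Check that the server and database containers are up
docker compose ps

# Follow the logs if something fails to start
docker compose logs -f
```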
Fully async, typed SDKs to instrument any LLM application. Currently available for Python.
| Package | Description | Links |
| --- | --- | --- |
| Python | | docs, repo |
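As a sketch of how the SDK is typically added to a project, it can be installed from PyPI; the package name below is inferred from the repository name, so confirm it against the linked docs:

```bash
# Install the Python SDK (package name assumed from the repo name;
# confirm on PyPI or in the linked docs)
pip install promptmodel
```

From there, prompts managed in the web editor can be called from application code via the SDK; see the linked docs for the client API.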
The maintainers are very active in the Promptmodel Discord and are happy to answer questions or discuss feedback/ideas regarding the future of the project.
Join the community on Discord.
To contribute, send us a PR, raise a GitHub issue, or email us at contributing@promptmodel.run.
See CONTRIBUTING.md for details on how to set up a development environment.
Promptmodel is MIT licensed. Enterprise-edition features will be released under a separate license in the future. See LICENSE for more details.