- Follow the instructions in the model-server repository to set up the model server.
- Clone this repository.
git clone https://github.com/snipaid-nlg/demo.git
- Change into the project directory.
cd demo
- Create a file named ".env" in the project directory.
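One way to do this from the command line (any text editor works just as well):
touch .env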
- Insert the following content into the ".env" file, replacing Your-Personal-Banana-Api-Key, Your-Personal-Banana-Model-Key, and Your-Strong-Secret-Key-For-Django with your own keys.
# Secret key for Django (at least 6 characters)
SECRET_KEY=Your-Strong-Secret-Key-For-Django
# API Key for Banana (copy from Banana)
BANANA_API_KEY=Your-Personal-Banana-Api-Key
# API Key for Model (copy from Banana)
BANANA_MODEL_KEY=Your-Personal-Banana-Model-Key
- Build and run with Docker.
docker compose up --build
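Optionally, to run the containers in the background and stop them again later, the standard Compose commands apply:
docker compose up --build -d
docker compose down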
- Visit http://localhost:8000 to see if SnipAId is running.
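Optionally, a quick check from the terminal (an HTTP 200 or redirect status means the server is responding):
curl -I http://localhost:8000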
- Copy in a text and generate some titles and teasers!
SnipAId is built on GPT-J, an open-source model from the GPT family. The core functionality of GPT models is to take a string of text and predict the next token. When generating text with SnipAId, keep in mind that the statistically most likely next token is often not the token that produces the most "accurate" text. Never depend on these models to produce factually accurate output! We recommend having a human curate or filter the outputs before releasing them, both to censor undesirable content and to improve the quality of the results.
See also the limitations and biases of GPT-J.