# FastAPI + OpenAI + WebSocket chat example

## Setup and run locally

### Build and run with Docker Compose

- First, copy `.env` from the `.env.dist` template if you have not already:

  ```shell
  cp .env.dist .env
  ```

- Set your OpenAI API secret key in `.env`.
- Build and start the containers:

  ```shell
  docker-compose up -d --build
  ```

- To check that the server is running, open:

  http://0.0.0.0:8090/

- To check that the frontend is running, open:

  http://0.0.0.0:3002/
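For orientation, a minimal `docker-compose.yml` matching the ports above might look like the sketch below. This is an illustration only: the service names, build contexts, and port mappings are assumptions, except for the `chat_api` container name, which the commands later in this README rely on.

```yaml
services:
  chat_api:
    build: ./server        # assumed build context
    container_name: chat_api   # referenced by `docker exec -it chat_api ...` below
    env_file: .env         # supplies the OpenAI API key
    ports:
      - "8090:8090"        # backend, per the health-check URL above
  frontend:
    build: ./frontend      # hypothetical path
    ports:
      - "3002:3002"        # frontend, per the URL above
```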


## How to teach ChatGPT to use your data for answers

- Put your files into the `./server/docs` directory.
- Run this command:

  ```shell
  docker exec -it chat_api bash -c "python chat_model.py"
  ```

- Restart the `chat_api` container (e.g. `docker restart chat_api`).
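The repository's `chat_model.py` is not shown here, but the ingestion step above boils down to reading every file under `./server/docs` and indexing its text. A stdlib-only sketch of just the loading step (the `load_docs` helper is hypothetical, not the repo's actual API; in the real script the text would then be chunked and embedded for retrieval):

```python
from pathlib import Path


def load_docs(docs_dir: str = "./server/docs") -> dict[str, str]:
    """Read every file under docs_dir into a {relative_path: contents} map.

    Hypothetical helper: illustrates only the document-loading step that
    chat_model.py would perform before embedding the text for ChatGPT.
    """
    docs: dict[str, str] = {}
    for path in Path(docs_dir).glob("**/*"):
        if path.is_file():
            docs[str(path.relative_to(docs_dir))] = path.read_text(encoding="utf-8")
    return docs
```

Anything placed in `./server/docs` before the command runs would then be picked up by this step.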

## Tests

```shell
docker exec -it chat_api bash -c "python -m pytest"
docker exec -it chat_api bash -c "mypy src tests"
docker exec -it chat_api bash -c "flake8 src tests"
```