- Frontend: React (with TypeScript)
- Backend: FastAPI (Python)
- Deployment: Docker & Docker Compose
In this project, we build a chatbot application that takes a document file (PDF, TXT, DOCX, or CSV) as input and answers the user's queries about it. The goal of this application is to accurately provide answers based on the uploaded file. It can be used as an assistant to quickly answer questions or summarize facts from files containing large amounts of text data, making our lives easier.
In this project you will find two directories:

- `backend`: the server-side Python code
- `frontend`: the client-side TypeScript code
Requirements:
- Python 3.10 or above.
- OpenAI API key
- `main.py` is the entry point to our server.
- This project has a few Python packages as dependencies; you can install them in your virtual environment using `requirements.txt`.
- We will be using the `conda` package manager to create a virtual environment named `chatbot`, using `conda create -n chatbot python=3.10` and then `conda activate chatbot` to activate the environment.
- Then install the Python packages using `pip install -r requirements.txt`.
**Important:** Make sure to rename the `.env.template` file in the `backend/app` directory to `.env` and add your OpenAI API key for error-free deployment.
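That step can be sketched as follows. The environment-variable name `OPENAI_API_KEY` is an assumption; use whatever name `.env.template` actually declares.

```shell
# Rename the template to .env (run from the repository root)
mv backend/app/.env.template backend/app/.env

# Append your key; the variable name OPENAI_API_KEY is an assumption,
# check .env.template for the exact name it uses
echo 'OPENAI_API_KEY=sk-...' >> backend/app/.env
```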
To launch the server, navigate to the `backend` directory and run:

This will start the server at http://127.0.0.1:8000/
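The launch command itself does not appear above. A typical invocation for a FastAPI project, assuming `main.py` exposes a FastAPI instance named `app` (an assumption; check `main.py` for the actual object name), would be:

```shell
# Assumption: main.py defines a FastAPI instance called `app`.
# uvicorn serves on 127.0.0.1:8000 by default; --reload enables
# auto-restart on code changes during development.
uvicorn main:app --reload
```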
The project structure within the `frontend` directory follows the official `create-react-app` structure as in the docs.
Requirements: We are using Node `v20.11.1` and npm `10.2.4`. They can be downloaded via the installer. For more information check here
- Navigate to the `frontend` directory and run `npm install`
- Then you can run:
This will launch the app in development mode.
Open http://localhost:3000 to view it in the browser.
The page will reload if you make edits. You will also see any lint errors in the console.
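The run command is not shown above; since the project follows the `create-react-app` conventions, the standard development script is presumably what is meant:

```shell
# From the frontend directory: install dependencies, then start the
# create-react-app development server (serves on http://localhost:3000)
npm install
npm start
```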
- Run the command `docker-compose up -d` to start the frontend and backend containers.
- Open http://127.0.0.1:8000/docs to access the Swagger UI.
- Open http://localhost:3000 to access the frontend UI.
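A few standard Docker Compose commands that are useful alongside `up -d` (these are generic Docker Compose CLI commands, not specific to this repository):

```shell
docker-compose ps        # list the running containers for this project
docker-compose logs -f   # follow the combined frontend/backend logs
docker-compose down      # stop and remove the containers
```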
Click on the image below to see the demo 👇
- Handle edge cases where the user uploads a very large file (>100 MB) or an unsupported file type such as video or MP3.
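A minimal sketch of such a check, as a hypothetical helper that is not part of the current codebase: it rejects oversized or unsupported uploads before they reach the question-answering pipeline. The function name and limits are illustrative assumptions.

```python
# Hypothetical upload validator (not in the repo): accepts only the
# document types the chatbot supports and enforces a 100 MB size cap.
ALLOWED_EXTENSIONS = {".pdf", ".txt", ".docx", ".csv"}
MAX_FILE_SIZE = 100 * 1024 * 1024  # 100 MB

def validate_upload(filename: str, size_bytes: int) -> tuple[bool, str]:
    """Return (ok, reason); reason is empty when the file is accepted."""
    ext = ("." + filename.rsplit(".", 1)[-1].lower()) if "." in filename else ""
    if ext not in ALLOWED_EXTENSIONS:
        return False, f"Unsupported file type: {ext or 'no extension'}"
    if size_bytes > MAX_FILE_SIZE:
        return False, "File exceeds the 100 MB limit"
    return True, ""
```

In a FastAPI route this would run right after receiving the `UploadFile`, returning an HTTP 4xx response with the reason string when validation fails.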