This demo was built following the official LlamaIndex Starter Tutorial (Local Models). It also implements index persistence from the Starter Tutorial (OpenAI).
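The index persistence mentioned above follows the build-once, reload-later pattern from the OpenAI Starter Tutorial. A minimal sketch, assuming a `./data` documents folder and a `./storage` persist directory (the actual wiring in src/demo.py may differ):

```python
import os

PERSIST_DIR = "./storage"  # assumed persist location
DATA_DIR = "./data"        # assumed documents folder

def get_index(persist_dir: str = PERSIST_DIR, data_dir: str = DATA_DIR):
    """Build the index on the first run; reload it from disk on later runs."""
    from llama_index.core import (
        SimpleDirectoryReader,
        StorageContext,
        VectorStoreIndex,
        load_index_from_storage,
    )

    if not os.path.exists(persist_dir):
        # First run: embed the documents and persist the index to disk.
        documents = SimpleDirectoryReader(data_dir).load_data()
        index = VectorStoreIndex.from_documents(documents)
        index.storage_context.persist(persist_dir=persist_dir)
    else:
        # Later runs: reload the persisted index instead of re-embedding.
        storage_context = StorageContext.from_defaults(persist_dir=persist_dir)
        index = load_index_from_storage(storage_context)
    return index
```

Persisting avoids paying for re-embedding the same documents on every launch.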
Models tested:
- Embedding model: text-embedding-3-small
- Generative model: GPT-3.5-turbo
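With the models listed above, the global LlamaIndex settings can be configured as follows (an assumed sketch; the demo's actual configuration lives in src/demo.py):

```python
# Assumed model configuration using the two models listed above.
from llama_index.core import Settings
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.openai import OpenAI

Settings.llm = OpenAI(model="gpt-3.5-turbo")
Settings.embed_model = OpenAIEmbedding(model="text-embedding-3-small")
```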
To run the demo:
- Create a virtual environment and activate it:
python3 -m venv .venv
source .venv/bin/activate
- Install the requirements:
pip install -r requirements.txt
- Set your OpenAI API key in an environment variable:
export OPENAI_API_KEY=<your_openai_api_key>
- Launch the app:
python3 src/demo.py
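Since the demo depends on the exported key, it can be useful to fail fast with a clear message when it is missing. A small sketch (the helper name is hypothetical, not part of the demo):

```python
import os

def require_api_key(var: str = "OPENAI_API_KEY") -> str:
    """Return the key from the environment, or raise a clear error if unset."""
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(
            f"{var} is not set; run `export {var}=<your_openai_api_key>` first."
        )
    return key
```

Calling such a check at the top of the script surfaces a readable error instead of an authentication failure deep inside the OpenAI client.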