A simple, lightweight chatbot that can be deployed locally.
- ✅ Chat with an LLM through remote APIs.
- ✅ Chat with an LLM through a locally deployed Ollama server.
- ✅ Lightweight backend and frontend.
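As a quick illustration of the Ollama feature above, the sketch below sends a prompt to a locally running Ollama server using only the Python standard library. It assumes Ollama's default endpoint (`http://localhost:11434/api/generate`) and that a model such as `llama3` has already been pulled with `ollama pull llama3`; the function names here are hypothetical and not part of this repository.

```python
# Minimal sketch: chat with a locally running Ollama server.
# Assumes the default Ollama port (11434) and an already-pulled model.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_payload(prompt, model="llama3"):
    """Build the JSON body for a non-streaming /api/generate request."""
    return {"model": model, "prompt": prompt, "stream": False}


def chat(prompt, model="llama3"):
    """Send a prompt to the local Ollama server and return its reply text."""
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `ollama serve` running, `print(chat("Hello!"))` should print the model's reply.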
Install and run localai on your PC or server by following these steps:
- Ensure the following are installed: Node.js (v18+), Yarn, Python (3.9.0+), Ollama (recommended, not required)
- Clone this repository
- Install dependencies: `cd` into the repo directory and run `pip install -r backend/requirements.txt`, then `cd frontend && yarn install`
- Start the server: run `make bootstrap` and enjoy!
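Before running the steps above, it can help to confirm the prerequisites are on your `PATH`. The helper below is a hypothetical convenience script, not part of the repository; it only reports whether each tool is found and does not verify version numbers.

```python
# Quick preflight check for the prerequisites listed above.
# Hypothetical helper: reports which tools are on PATH; it does NOT
# verify versions (Node.js v18+, Python 3.9.0+).
import shutil


def check_prerequisites(tools=("node", "yarn", "python3", "ollama")):
    """Return a dict mapping each tool name to True if it is on PATH."""
    return {tool: shutil.which(tool) is not None for tool in tools}


if __name__ == "__main__":
    for tool, found in check_prerequisites().items():
        print(f"{tool}: {'ok' if found else 'MISSING'}")
```

Remember that Ollama is optional, so a missing `ollama` entry only matters if you plan to chat through a local model.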