Make it possible for anyone to run a simple AI app that can do document Q&A 100% locally without having to swipe a credit card 💳. Based on AI Starter Kit.
Have questions? Join AI Stack devs and find me in the #local-ai-stack channel.
- 🦙 Inference: Ollama
- 💻 VectorDB: Supabase pgvector
- 🧠 LLM Orchestration: Langchain.js
- 🖼️ App logic: Next.js
- 🧮 Embeddings generation: Transformers.js and all-MiniLM-L6-v2 (sketched below)
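To see what the embeddings piece looks like in practice, here is a minimal sketch using Transformers.js's feature-extraction pipeline. The `@xenova/transformers` package name and the `Xenova/all-MiniLM-L6-v2` model id are the upstream Transformers.js conventions; this is an illustration, not this repo's exact code:

```ts
import { pipeline } from "@xenova/transformers";

// Load the embedding model once; weights are downloaded and cached locally,
// so everything after the first run stays offline.
const embed = await pipeline("feature-extraction", "Xenova/all-MiniLM-L6-v2");

// Mean-pool token embeddings and normalize to get one sentence vector.
const output = await embed("Local document Q&A without a credit card", {
  pooling: "mean",
  normalize: true,
});
console.log(output.data.length); // all-MiniLM-L6-v2 produces 384-dim vectors
```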
Fork the repo to your GitHub account, then run the following commands to clone it and install dependencies:
```
git clone git@github.com:[YOUR_GITHUB_ACCOUNT_NAME]/local-ai-stack.git
cd local-ai-stack
npm install
```
Install Ollama. Instructions are here.
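Once Ollama is installed and a model is pulled (e.g. `ollama pull llama2`; the model name here is just an example), the app can talk to it over Ollama's local HTTP API. A minimal sketch of a completion request:

```ts
// Ollama serves a local HTTP API on port 11434 by default.
const res = await fetch("http://localhost:11434/api/generate", {
  method: "POST",
  body: JSON.stringify({
    model: "llama2", // any model you've pulled locally
    prompt: "Why is the sky blue?",
    stream: false, // ask for a single JSON response, not a token stream
  }),
});
const { response } = await res.json();
console.log(response);
```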
- Install the Supabase CLI:

  ```
  brew install supabase/tap/supabase
  ```
- Start Supabase. Make sure you are in the `local-ai-stack` directory and run:

  ```
  supabase start
  ```
Copy the example env file:

```
cp .env.local.example .env.local
```

Then get `SUPABASE_PRIVATE_KEY` by running:

```
supabase status
```

Copy the `service_role key` value and save it as `SUPABASE_PRIVATE_KEY` in `.env.local`.
Generate embeddings for the sample blog posts:

```
node src/scripts/indexBlogLocal.mjs
```

This script takes all files from `/blogs`, generates embeddings using Transformers.js, and stores the embeddings along with their metadata in Supabase.
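At question time the app does the reverse: it embeds the user's question with the same model and asks pgvector for the nearest stored chunks. A minimal sketch of that lookup, assuming a `match_documents` SQL function in the style of Langchain.js's Supabase vector store integration (the function name, its parameters, and the hardcoded local URL are assumptions for illustration, not necessarily this repo's exact schema):

```ts
import { createClient } from "@supabase/supabase-js";
import { pipeline } from "@xenova/transformers";

// Local Supabase instance; `supabase status` prints the actual API URL.
const supabase = createClient(
  "http://localhost:54321",
  process.env.SUPABASE_PRIVATE_KEY!,
);

// Embed the question with the same model used at indexing time.
const embed = await pipeline("feature-extraction", "Xenova/all-MiniLM-L6-v2");
const question = await embed("What are these blog posts about?", {
  pooling: "mean",
  normalize: true,
});

// Hypothetical RPC following the Langchain.js SupabaseVectorStore convention:
// a `match_documents` function that runs a pgvector similarity search.
const { data: matches, error } = await supabase.rpc("match_documents", {
  query_embedding: Array.from(question.data),
  match_count: 3,
});
if (error) throw error;
console.log(matches); // nearest chunks, ready to go into the LLM prompt
```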
Now you are ready to test out the app locally! From the project root, run:

```
npm run dev
```

Then visit http://localhost:3000.
If you want to take the local-only app to the next level, feel free to follow the instructions in the AI Starter Kit for using Clerk, Pinecone/Supabase, OpenAI, Replicate, and other cloud-based vendors.