LocalAi

Overview

A simple, lightweight chatbot that can be deployed locally.

Features

  • ✅ Chat with an LLM through remote APIs.
  • ✅ Chat with an LLM served by a locally deployed Ollama instance (see the smoke test after this list).
  • ✅ Lightweight backend and frontend.
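
For the Ollama-backed mode, the backend talks to an Ollama server running on the same machine. The snippet below is only a minimal smoke test of that server, not project code; it assumes Ollama's default port (11434) and that a model such as `llama3` has already been pulled.

```sh
# Smoke-test the local Ollama server before pointing localai at it.
# Assumptions: Ollama listens on its default port 11434 and the
# "llama3" model has been pulled beforehand (`ollama pull llama3`).
curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [{ "role": "user", "content": "Hello from localai!" }],
  "stream": false
}'
```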

Quick Start

Your PC/Server

Install and run localai on your PC or server by following these steps:

  1. Ensure the following are installed: Node.js (v18+), Yarn, Python (3.9.0+), and Ollama (recommended, not required)
  2. Clone this repository
  3. Install requirements: from the repository root, run `pip install -r backend/requirements.txt`, then `cd frontend && yarn add all`
  4. Start the server: run `make bootstrap` and enjoy! (A consolidated command block follows these steps.)
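
For convenience, here are the same steps as a copy-pasteable shell session. It assumes a Unix-like shell and that the repository is cloned from the GitHub path shown above; adjust the URL if you work from a fork.

```sh
# Prerequisites: Node.js 18+, Yarn, Python 3.9+ (Ollama optional but recommended).
git clone https://github.com/zhangsikai123/localai.git
cd localai
pip install -r backend/requirements.txt   # backend dependencies
(cd frontend && yarn add all)             # frontend dependencies, as documented in step 3
make bootstrap                            # start the server
```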
