An open-source, AI-powered search engine for research data management, based on farfalle (an open-source Perplexity analogue).
- 🛠️ Tech Stack
- 🛠️ Features
- 🏃🏿‍♂️ Getting Started
- 🚀 Deploy
## 🛠️ Tech Stack

- Frontend: Next.js
- Backend: FastAPI
- Search API: SearXNG, Tavily, Serper, Bing
- Logging: Logfire
- Rate Limiting: Redis
- Components: shadcn/ui
## 🛠️ Features

- Search with multiple search providers (Tavily, SearXNG, Serper, Bing)
- Answer questions with cloud models (OpenAI/gpt-4o, OpenAI/gpt-3.5-turbo, Groq/Llama3)
- Answer questions with local models (llama3, mistral, gemma, phi3)
- Answer questions with any custom LLMs through LiteLLM
- Search with an agent that plans and executes the search for better results
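As a sketch of how provider keys come together, a minimal `.env` might look like the following. The variable names here are illustrative assumptions; `.env-template` in the repository is the authoritative list of keys.

```ini
# Illustrative only -- confirm key names against .env-template
# Cloud LLM provider key (skip if answering with local Ollama models)
OPENAI_API_KEY=sk-...
# Search provider key (only the provider you select needs one)
TAVILY_API_KEY=tvly-...
```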
## 🏃🏿‍♂️ Getting Started

- Docker
- Ollama (if running local models)
  - Download any of the supported models: llama3, mistral, gemma, phi3
  - Start the Ollama server:

    ```shell
    ollama serve
    ```
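For example, a model can be fetched ahead of time with the Ollama CLI (this assumes Ollama is installed locally):

```shell
# Fetch model weights locally; swap llama3 for mistral, gemma, or phi3
ollama pull llama3
```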
Clone the repo and create your `.env` file:

```shell
git clone https://github.com/UB-Mannheim/FAIR-farfalle.git
cd FAIR-farfalle && cp .env-template .env
```
Modify `.env` with your API keys (optional; not required when using Ollama).
Start the app:

```shell
docker-compose up -d
```
Wait for the app to start, then visit http://localhost:3000.
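As an optional sanity check (assuming the default port mapping from `docker-compose`), you can confirm the frontend is responding:

```shell
# Should print 200 once the app has finished starting
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3000
```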
For custom setup instructions, see custom-setup-instructions.md