
mastra-ai/mastra


Mastra Framework


Mastra lets you prototype and productionize AI features quickly with a modern JS/TS stack.

  • Workflows: Chain calls to OpenAI, Anthropic, Google. Pipe output between steps. Create graphs with workflow.addStep() and step.connect().
  • RAG pipeline: Sync data into a vector DB (Pinecone). Mastra integrates with 50+ SaaS services, web scrapers, etc.
  • Agents: Provide OpenAI Assistants with tools, workflows, and synced data.
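To make the chaining model concrete, here is a toy, self-contained sketch of the `workflow.addStep()` / `step.connect()` shape described above. This is not the real Mastra API: the `Workflow` and `Step` classes below are illustrative stand-ins, and plain async functions replace the model calls.

```typescript
// Toy illustration of step chaining — not the Mastra API.
type StepFn = (input: string) => Promise<string>;

class Step {
  next: Step | null = null;
  constructor(public name: string, public run: StepFn) {}
  // Pipe this step's output into another step.
  connect(step: Step): Step {
    this.next = step;
    return step;
  }
}

class Workflow {
  private steps: Step[] = [];
  addStep(name: string, run: StepFn): Step {
    const step = new Step(name, run);
    this.steps.push(step);
    return step;
  }
  // Execute from the first step, piping each output into the next.
  async run(input: string): Promise<string> {
    let current: Step | null = this.steps[0] ?? null;
    let value = input;
    while (current) {
      value = await current.run(value);
      current = current.next;
    }
    return value;
  }
}

// Stand-ins for model calls (e.g. one step per provider).
const workflow = new Workflow();
const draft = workflow.addStep("draft", async (topic) => `Draft about ${topic}`);
const review = workflow.addStep("review", async (text) => `Reviewed: ${text}`);
draft.connect(review);

workflow.run("agents").then((out) => console.log(out));
// → "Reviewed: Draft about agents"
```

In the real framework each step would wrap a provider call, but the control flow — a graph of steps whose outputs feed their successors — is the same idea.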

Mastra uses Inngest and Prisma to store and sync data into vector databases. It includes an admin panel for exploring data, a playground for testing actions, a chat UI for agents, and a visual workflow builder with variables, conditions, and branching.

The setup is completely self-contained and runs on your local machine. In production, you can self-host or deploy to Vercel/Netlify.

Quick Start

Prerequisites

  • Node.js (version 20 or later)
  • pnpm (version 9.7.0 or later)
  • Docker (the daemon should be running)

Installation

  1. Install the Mastra CLI:
$ npm install -g mastra
  2. Initialize your project:
$ mastra init
  3. Provision local resources. The CLI will prompt you for connection details:
Enter your PostgreSQL connection string (postgresql://username:password@host:port/database) or press Enter to create a new instance:
Enter your Inngest server URL or press Enter to create a new instance:

Configuration

After initialization, you'll find a mastra.config.ts file in your project root. You can find the full list of configuration options in the Mastra config docs.
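For orientation only, a config file of this general shape might look like the sketch below. The field names here are hypothetical, not the documented schema; consult the Mastra config docs for the real options.

```typescript
// Hypothetical sketch of mastra.config.ts — field names are
// illustrative assumptions, not the documented Mastra schema.
export const config = {
  // hypothetical: a label for your project
  name: "my-project",
  db: {
    // hypothetical: the Postgres instance provisioned by `mastra init`
    provider: "postgres",
    uri: process.env.DATABASE_URL,
  },
  // hypothetical: SaaS integrations to sync into the vector DB
  integrations: [],
};
```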

Deployment

Mastra's data syncing infrastructure is designed for Next.js sites running on serverless hosting providers like Vercel or Netlify.

Job queues are managed with Inngest, which can be self-hosted or run as a managed service.

Logs are stored in Upstash.

See the Mastra documentation for the full deployment guide.