
A seamless integration of Chainlit with LangGraph. Ideal for rapid prototyping of multi-LLM chatbots and AI agents.

brucechou1983/chainlit_langgraph


Rapidly build and deploy production-ready conversational AI agents using Chainlit and LangGraph. This powerful integration combines state-of-the-art language models with flexible workflow management, enabling developers to create sophisticated chatbots, virtual assistants, and interactive AI applications in minutes.

Demo

Table of Contents

  • Why This Project?
  • Features
  • Getting Started
  • Setting up Ollama (Optional)
  • Creating Custom Workflow
  • Workflows
  • Upcoming Features

Why This Project?

Chainlit is a powerful tool for building production-ready conversational AI applications, while LangGraph is a versatile framework for building and managing state graphs in AI applications. This project combines the two so you can build sophisticated conversational AI agents in minutes.

Features

  • Building Blocks: Utilize a variety of building blocks to create your own conversational AI agents.
  • Multiple LLM Support: Automatically detects and uses available LLM providers such as OpenAI, Anthropic, and Ollama (a rough sketch of the detection idea follows this list).
  • Examples: Explore a variety of use cases through the included example agents.
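
How provider detection works internally is not documented here; below is a minimal sketch, assuming detection is driven purely by the environment variables described in Getting Started. The function and model names are illustrative, not the project's actual API:

import os

def detect_available_models() -> list[str]:
    # Illustrative only: offer models for each provider whose key is configured.
    models: list[str] = []
    if os.getenv("OPENAI_API_KEY"):
        models.append("gpt-4o-mini")               # example OpenAI model
    if os.getenv("ANTHROPIC_API_KEY"):
        models.append("claude-3-5-sonnet-latest")  # example Anthropic model
    # Ollama runs locally and needs no API key; the real app likely probes the
    # local Ollama server for whatever models have been pulled.
    models.append("llama3.2:3b-instruct-q8_0")
    return models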

Getting Started

Follow these steps to set up and run the project using Docker Compose or a Python 3.10 virtual environment.

  1. Make sure you have Docker and Docker Compose installed on your system.
  2. Clone this repository and navigate to the project directory.
  3. Copy the .env.example file to .env and update the necessary environment variables:
cp .env.example .env
  4. Edit the .env file and set the required variables, including:
  • API keys (OPENAI_API_KEY, ANTHROPIC_API_KEY): Optional if you use Ollama.
  • DB volume settings (POSTGRES_VOLUME_PATH, MINIO_VOLUME_PATH): Create mount folders on your host machine and set the paths accordingly.
  • (Optional) TAVILY_API_KEY to enable web search
  • (Optional) Google OAuth credentials for login
  • (Optional) LangSmith tracing
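
For reference, a filled-in .env might look roughly like the following (placeholder values; the variable names and any additional settings in .env.example are authoritative):

# API keys (optional if you only use Ollama)
OPENAI_API_KEY=sk-your-key-here
ANTHROPIC_API_KEY=your-key-here
# Host folders to mount for PostgreSQL and MinIO data
POSTGRES_VOLUME_PATH=./data/postgres
MINIO_VOLUME_PATH=./data/minio
# Optional: enables the web search tool
TAVILY_API_KEY=your-key-here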
  5. Start the services using Docker Compose:
docker compose up

This will start all the necessary services, including the Chainlit application, PostgreSQL database, and MinIO object storage.

  6. The application should now be running at http://localhost:8000. Log in with the default username and password (admin:admin). You can change the default credentials in the .env file.

Setting up Ollama (Optional)

  1. Download and install Ollama.
  2. Pull whatever model you want to use, for example:
ollama pull cas/ministral-8b-instruct-2410_q4km:latest
ollama pull llama3.2:3b-instruct-q8_0

or run any GGUF-based model hosted on Hugging Face directly:

ollama run hf.co/{username}/{repository}:{quantization}
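
Once a model has been pulled, the application reaches it through the local Ollama server. As a quick standalone sanity check (independent of this project), you can talk to the model with LangChain's Ollama integration, assuming the langchain-ollama package is installed:

from langchain_ollama import ChatOllama

# Ollama serves pulled models on localhost:11434 by default
llm = ChatOllama(model="llama3.2:3b-instruct-q8_0")
print(llm.invoke("Say hello in one short sentence.").content)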

Creating Custom Workflow

Creating your own custom workflow allows you to tailor the application to your specific needs. Follow the step-by-step guide below to create your own workflow.

  1. Go to the chat_workflow/workflows directory in your project, and create a new Python file for your workflow, e.g., my_custom_workflow.py.
  2. Define Your State Class
  • Inherit from BaseState to define the state variables your workflow will use. For example:
class MyCustomState(BaseState):
  # Model name of the chatbot
  chat_model: str
  # Add other state variables as needed
  3. Define Your Workflow
  • Inherit from BaseWorkflow to define your custom workflow logic, and override the create_graph method to define the state graph. (A consolidated, runnable sketch appears at the end of this section.)
class MyCustomWorkflow(BaseWorkflow):
  def create_graph(self) -> StateGraph:
      # LangGraph graph definition
      graph = StateGraph(MyCustomState)
      # Add nodes to the graph
      graph.add_node("chat", self.chat_node)
      # Add edges between nodes
      graph.add_edge("chat", END)
      # Set the entry point of the graph
      graph.set_entry_point("chat")
      return graph
  • Define node methods such as self.chat_node (referenced in create_graph) on your workflow class.
  • Define the default state by overriding the create_default_state method.
def create_default_state(self) -> MyCustomState:
  return {
      "name": self.name(),
      "messages": [],
      "chat_model": "",
      # Initialize other state variables if needed
  }
  • Set workflow properties.
    • name: The display name of the workflow, for example "My Custom Workflow".
    • output_chat_model: The name of the LLM model that produces the final response.
    • chat_profile: The Chainlit chat profile shown for the workflow.
    • starter: The starter messages suggested for the workflow.
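
Putting the pieces together, here is a consolidated sketch of a custom workflow. It assumes BaseWorkflow and BaseState are importable from the workflows package and that the base class exposes a get_chat_model helper; those names (and the chat_node body) are illustrative, so adapt them to the actual base classes in chat_workflow/workflows.

from langgraph.graph import StateGraph, END
# Assumed import path; adjust to where BaseWorkflow and BaseState live in this repo
from .base import BaseWorkflow, BaseState

class MyCustomState(BaseState):
    # Model name of the chatbot
    chat_model: str

class MyCustomWorkflow(BaseWorkflow):
    def create_graph(self) -> StateGraph:
        graph = StateGraph(MyCustomState)
        graph.add_node("chat", self.chat_node)
        graph.add_edge("chat", END)
        graph.set_entry_point("chat")
        return graph

    async def chat_node(self, state: MyCustomState):
        # Illustrative node body: ask the configured model for a reply and
        # append it to the running message list.
        llm = self.get_chat_model(state["chat_model"])  # hypothetical helper
        response = await llm.ainvoke(state["messages"])
        return {"messages": [response]}

    def create_default_state(self) -> MyCustomState:
        return {
            "name": self.name(),
            "messages": [],
            "chat_model": "",
        }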

Workflows

This project includes several pre-built workflows that demonstrate the capabilities of the Chainlit-LangGraph integration:

Simple Chat

Located in simple_chat.py, this workflow provides a basic chatbot experience:

  • Utilizes a state graph with chat and tool nodes
  • Supports multiple language models
  • Includes basic tools like datetime and web search
  • Supports images and text inputs
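
The chat-and-tool loop described above follows the standard LangGraph tool-calling pattern. Here is a minimal, generic sketch of that pattern (not this repository's actual node implementations) using LangGraph's prebuilt helpers:

from datetime import datetime
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, MessagesState, START
from langgraph.prebuilt import ToolNode, tools_condition

@tool
def current_datetime() -> str:
    """Return the current date and time."""
    return datetime.now().isoformat()

tools = [current_datetime]
llm = ChatOpenAI(model="gpt-4o-mini").bind_tools(tools)  # any tool-calling model works

def chat(state: MessagesState) -> dict:
    # The model either answers directly or emits a tool call
    return {"messages": [llm.invoke(state["messages"])]}

graph = StateGraph(MessagesState)
graph.add_node("chat", chat)
graph.add_node("tools", ToolNode(tools))
graph.add_edge(START, "chat")
graph.add_conditional_edges("chat", tools_condition)  # routes to "tools" or ends
graph.add_edge("tools", "chat")
app = graph.compile()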

Resume Optimizer

Found in resume_optimizer.py, this workflow helps users improve their resumes:

  • Features a resume extractor node to process uploaded PDF resumes
  • Provides detailed analysis and suggestions for resume improvement

Lean Canvas Chat

Implemented in lean_canvas_chat.py, this workflow assists in business modeling:

  • Guides users through the Lean Canvas creation process
  • Offers a structured approach to defining business models

Each workflow demonstrates different aspects of the Chainlit-LangGraph integration, showcasing its flexibility and power in creating AI-driven applications.

Upcoming Features

  • Research Assistant: A research assistant that can help users with their general research tasks, like NotebookLM.
  • NVIDIA NIM: Self-host GPU-accelerated inferencing microservices for pretrained and customized AI models across clouds, data centers, and workstations.
  • Cloud Deployment: Easy deployment of the application to cloud platforms like AWS, Azure, or GCP.
  • Graph Builder: A meta-workflow builder that allows users to create custom workflows with natural language.
  • OpenAI o1-like agentic workflow: An advanced self-prompting workflow in the style of OpenAI o1.
  • Image Generation: Generate images based on user input.
