- About the Chain Server
- Running the Chain Server Independently
- Supporting Additional Document File Types
- Chain Server REST API Reference
The chain server is implemented as a sample FastAPI-based server so that you can experience a Q&A chat bot. The server wraps calls made to different components and orchestrates the entire flow for all the generative AI examples.
To run the server for development purposes, run the following commands:
- Build the container from source:

  ```bash
  cd RAG/examples/advanced_rag/multi_turn_rag
  docker compose build chain-server
  ```

- Start the container, which starts the server:

  ```bash
  docker compose up -d chain-server
  ```
- Open a browser to http://host-ip:8081/docs to view the REST API and try the exposed endpoints.
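If you prefer to call the API from code instead of the browser, the following sketch shows one way to do it with Python. It assumes the server is reachable at http://localhost:8081 and exposes `/health` and `/generate` endpoints; confirm the exact paths and request fields on the `/docs` page for the example you are running.

```python
# Sketch: call a locally running chain server with the requests library.
# The /health and /generate paths and the request fields below are assumptions;
# verify them against http://localhost:8081/docs before relying on them.
import requests

BASE_URL = "http://localhost:8081"

# Confirm the server is up.
health = requests.get(f"{BASE_URL}/health", timeout=10)
print("health:", health.status_code)

# Send a single-turn chat request without the knowledge base.
payload = {
    "messages": [{"role": "user", "content": "What is retrieval-augmented generation?"}],
    "use_knowledge_base": False,
}
response = requests.post(f"{BASE_URL}/generate", json=payload, timeout=120)

# The server may stream the answer as chunked data, so print the raw body.
print(response.text)
```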
Most of the examples support reading text files, Markdown files, and PDF files. The multimodal example supports PDF, PowerPoint, and PNG files.
As a simple example, consider the following steps that show how to add support for ingesting Jupyter Notebooks with LangChain.
- Optional: Edit the `RAG/src/chain-server/requirements.txt` file to add document loader packages.

  In this case, the basic LangChain example already includes the `langchain_community` package, so no edit is necessary.

- Edit the `RAG/examples/basic_rag/langchain/chains.py` file and make the following edits.

  - Import the notebook document loader:

    ```python
    from langchain_community.document_loaders import NotebookLoader
    ```

  - Update the `ingest_docs` function and make changes like the following example:

    ```python
    if not filename.endswith((".txt", ".pdf", ".md", ".ipynb")):
        raise ValueError(f"{filename} is not a valid Text, PDF, Markdown, or Jupyter Notebook file")

    try:
        # Load raw documents from the directory
        _path = filepath
        if filename.endswith(".ipynb"):
            raw_documents = NotebookLoader(_path, include_outputs=True).load()
        else:
            raw_documents = UnstructuredFileLoader(_path).load()
    ```

- Build and start the containers:

  ```bash
  docker compose up -d --build
  ```
After the containers start, ingest a Jupyter Notebook into the knowledge base and then query the LLM about the notebook.
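To verify the new loader end to end, you can drive the ingestion and query from a short script. The sketch below assumes a `/documents` endpoint that accepts a multipart file upload, a `/generate` endpoint that accepts a knowledge-base flag, and a notebook named `example.ipynb` in the current directory; adjust the paths and fields to match the schema shown at `/docs`.

```python
# Sketch: upload a notebook to the knowledge base and ask a question about it.
# The /documents and /generate endpoints, their fields, and example.ipynb are
# assumptions; check http://localhost:8081/docs for the schema your example exposes.
import requests

BASE_URL = "http://localhost:8081"

# Upload the notebook as a multipart file.
with open("example.ipynb", "rb") as f:
    upload = requests.post(f"{BASE_URL}/documents", files={"file": f}, timeout=300)
upload.raise_for_status()

# Query with the knowledge base enabled so the notebook content is retrieved.
payload = {
    "messages": [{"role": "user", "content": "Summarize the uploaded notebook."}],
    "use_knowledge_base": True,
}
answer = requests.post(f"{BASE_URL}/generate", json=payload, timeout=300)
print(answer.text)
```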
You can view the chain server REST API schema by accessing http://host-ip:8081/docs.
Alternatively, you can view the OpenAPI specification from the openapi_schema.json file.
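If you want a local copy of the schema from a running server, note that FastAPI applications publish it at `/openapi.json` by default, so a short script can download it; the port and output filename below are assumptions.

```python
# Download the OpenAPI schema from a running chain server; FastAPI serves it at
# /openapi.json by default. The port and output filename are assumptions.
import json

import requests

schema = requests.get("http://localhost:8081/openapi.json", timeout=10).json()
with open("openapi_schema.json", "w") as f:
    json.dump(schema, f, indent=2)
print(f"Saved schema with {len(schema.get('paths', {}))} endpoint paths")
```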