Integration Issues with LangGraph, RedisChatMessageHistory, and RunnableWithMessageHistory #23582
Replies: 1 comment
To integrate these components:

### 1. Install Dependencies

Ensure you have the necessary dependencies installed (the `langchain` package provides the chain constructors used below):

```shell
pip install -U langchain langchain-community redis langchain-openai
```

### 2. Set Up RedisChatMessageHistory

Import the Redis-backed message history:

```python
from langchain_community.chat_message_histories.redis import RedisChatMessageHistory
```

### 3. Create a Chain with History

Define a RAG chain that uses the chat history. Note that `retriever` must already be defined in your application (for example, from a vector store):

```python
from langchain.chains import create_history_aware_retriever, create_retrieval_chain
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-3.5-turbo")
# `retriever` is assumed to exist already, e.g.:
# retriever = vectorstore.as_retriever()

# Define the prompt template
contextualize_q_system_prompt = """Given a chat history and the latest user question \
which might reference context in the chat history, formulate a standalone question \
which can be understood without the chat history. Do NOT answer the question, \
just reformulate it if needed and otherwise return it as is."""
contextualize_q_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", contextualize_q_system_prompt),
        MessagesPlaceholder("chat_history"),
        ("human", "{input}"),
    ]
)

# Create the history-aware retriever
history_aware_retriever = create_history_aware_retriever(
    llm, retriever, contextualize_q_prompt
)

# Define the QA prompt template
qa_system_prompt = """You are an assistant for question-answering tasks. \
Use the following pieces of retrieved context to answer the question. \
If you don't know the answer, just say that you don't know. \
Use three sentences maximum and keep the answer concise.\
{context}"""
qa_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", qa_system_prompt),
        MessagesPlaceholder("chat_history"),
        ("human", "{input}"),
    ]
)

# Create the question-answer chain
question_answer_chain = create_stuff_documents_chain(llm, qa_prompt)

# Create the retrieval chain
rag_chain = create_retrieval_chain(history_aware_retriever, question_answer_chain)

# Wrap the chain with Redis-backed message history
chain_with_history = RunnableWithMessageHistory(
    rag_chain,
    lambda session_id: RedisChatMessageHistory(
        session_id, url="redis://localhost:6379"
    ),
    input_messages_key="input",
    history_messages_key="chat_history",
)

# Example configuration: session_id selects the Redis key for this conversation
config = {"configurable": {"session_id": "foo"}}

# Example invocations
print(chain_with_history.invoke({"input": "Hi! I'm bob"}, config=config))
print(chain_with_history.invoke({"input": "What's my name?"}, config=config))
```

### 4. Using Upstash Redis for a Serverless Setup

If you prefer a serverless setup, you can use Upstash Redis. Here's how to set it up:

```python
from langchain_community.chat_message_histories import UpstashRedisChatMessageHistory

URL = "<UPSTASH_REDIS_REST_URL>"
TOKEN = "<UPSTASH_REDIS_REST_TOKEN>"

history = UpstashRedisChatMessageHistory(
    url=URL, token=TOKEN, ttl=10, session_id="my-test-session"
)
history.add_user_message("hello llm!")
history.add_ai_message("hello user!")
```

You can then retrieve the messages using `history.messages`.

### 5. Integrate with LangGraph

LangGraph handles custom memory implementations through a concept called persistence, also known as checkpointing. To integrate external memory systems like Redis, you would add a custom checkpointer; the example below uses the built-in in-memory `MemorySaver`:
```python
from langchain_core.messages import SystemMessage
from langgraph.checkpoint import MemorySaver  # an in-memory checkpointer
# (in newer langgraph versions: from langgraph.checkpoint.memory import MemorySaver)
from langgraph.prebuilt import create_react_agent

# `model` and `tools` (including a `magic_function` tool used below)
# are assumed to be defined elsewhere in your application.
system_message = "You are a helpful assistant."
# This could also be a SystemMessage object:
# system_message = SystemMessage(content="You are a helpful assistant. Respond only in Spanish.")

memory = MemorySaver()
app = create_react_agent(
    model, tools, messages_modifier=system_message, checkpointer=memory
)

# thread_id plays the same role as session_id above: it selects which
# checkpointed conversation the agent resumes.
config = {"configurable": {"thread_id": "test-thread"}}

print(
    app.invoke(
        {
            "messages": [
                ("user", "Hi, I'm polly! What's the output of magic_function of 3?")
            ]
        },
        config,
    )["messages"][-1].content
)
print("---")
print(
    app.invoke({"messages": [("user", "Remember my name?")]}, config)["messages"][
        -1
    ].content
)
print("---")
print(
    app.invoke({"messages": [("user", "what was that output again?")]}, config)[
        "messages"
    ][-1].content
)
```

By following these steps, you can integrate Redis-backed message history into your LangChain application and use LangGraph's checkpointing for conversation memory.
### Checked other resources

### Commit to Help

### Example Code

### Description
I am currently working on integrating several components into a comprehensive chat application using LangServe and LangChain. Below, I detail the components involved and the specific issues I'm encountering. Any guidance or suggestions would be greatly appreciated.
**Components and Setup:**

- Retrieval chain: built with `create_retrieval_chain` to form the final `rag_chain`.
- Message history implementation: `RedisChatMessageHistory` along with `RunnableWithMessageHistory`. The intention is to leverage Redis for managing chat message history, tracking conversations by User ID and Conversation ID.
- LangGraph integration.

**Issues:**

- How can I use `RedisChatMessageHistory` within LangGraph, given that LangGraph primarily supports SQLite for checkpoints?

**Resources:**

**Request:**

I am seeking advice or examples on how to properly integrate `RedisChatMessageHistory` with LangGraph in a manner that maintains full functionality of the message history features. Any insights or pointers towards documentation or similar implementations would be incredibly helpful.
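For reference, the User ID / Conversation ID tracking described above can be reduced to composing a single session key, since `RedisChatMessageHistory` is keyed by one `session_id` string. A minimal sketch of that idea in plain Python follows; `make_session_id` and the dict-backed store are hypothetical illustrations, not LangChain APIs (in real code, `RunnableWithMessageHistory` can accept a multi-argument factory via its `history_factory_config` parameter with `ConfigurableFieldSpec` entries).

```python
from collections import defaultdict


def make_session_id(user_id: str, conversation_id: str) -> str:
    # Hypothetical helper: fold both identifiers into the single
    # session_id string that RedisChatMessageHistory expects.
    return f"{user_id}:{conversation_id}"


# Dict-backed stand-in for the per-session message stores that a
# history factory callable would create.
histories: dict[str, list] = defaultdict(list)


def get_history(user_id: str, conversation_id: str) -> list:
    return histories[make_session_id(user_id, conversation_id)]


get_history("alice", "conv-1").append(("human", "Hi! I'm Alice"))
get_history("alice", "conv-2").append(("human", "New topic"))

# Each (user, conversation) pair gets its own isolated history
print(len(get_history("alice", "conv-1")))  # 1
print(len(get_history("alice", "conv-2")))  # 1
```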
System Info
System Information
Package Information