
How to use BaseChatMessageHistory with LangGraph

Prerequisites

This guide assumes familiarity with chat message history.

We recommend that new LangChain applications take advantage of the built-in LangGraph persistence to implement memory.

In some situations, users may need to keep using an existing persistence solution for chat message history.

Here, we will show how to use LangChain chat message histories (implementations of BaseChatMessageHistory) with LangGraph.
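
For reference, the recommended built-in persistence is a one-line change at compile time: compile the graph with a checkpointer and pass a thread_id in the config. A minimal sketch, assuming a StateGraph builder like the one defined later in this guide:

from langgraph.checkpoint.memory import MemorySaver

# Built-in LangGraph persistence: the checkpointer stores graph state
# (including messages) per thread_id between invocations.
graph = builder.compile(checkpointer=MemorySaver())
config = {"configurable": {"thread_id": "1"}}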

Setup

%%capture --no-stderr
%pip install --upgrade --quiet langchain-anthropic langgraph
import os
from getpass import getpass

if "ANTHROPIC_API_KEY" not in os.environ:
os.environ["ANTHROPIC_API_KEY"] = getpass()

Chat message history

The message history needs to be parameterized by a conversation ID, or possibly by a (user ID, conversation ID) 2-tuple.

Many LangChain chat message histories will have a session_id or some namespace to allow keeping track of different conversations. Please refer to the specific implementation to check how it is parameterized.

The built-in InMemoryChatMessageHistory does not contain such a parameterization, so we will create a dictionary to keep track of the message histories.

import uuid

from langchain_core.chat_history import InMemoryChatMessageHistory

chats_by_session_id = {}


def get_chat_history(session_id: str) -> InMemoryChatMessageHistory:
    chat_history = chats_by_session_id.get(session_id)
    if chat_history is None:
        chat_history = InMemoryChatMessageHistory()
        chats_by_session_id[session_id] = chat_history
    return chat_history
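
If you need histories scoped by both user and conversation, as mentioned above, one option is to key the dictionary by a tuple. A minimal sketch; the two-key helper below is hypothetical, not part of LangChain:

chats_by_user_and_conversation: dict[tuple, InMemoryChatMessageHistory] = {}


def get_chat_history_for_user(user_id: str, conversation_id: str) -> InMemoryChatMessageHistory:
    # Hypothetical helper: one history per (user ID, conversation ID) pair
    key = (user_id, conversation_id)
    if key not in chats_by_user_and_conversation:
        chats_by_user_and_conversation[key] = InMemoryChatMessageHistory()
    return chats_by_user_and_conversation[key]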

Usage with LangGraph

Next, we will set up a basic chatbot using LangGraph. If you're not familiar with LangGraph, you should check out the LangGraph quickstart tutorial.

We'll create a LangGraph node for the chat model and manage the conversation history manually, taking into account the conversation ID passed as part of the RunnableConfig.

The conversation ID can be passed either as part of the RunnableConfig (as we do here) or as part of the graph state.

import uuid

from langchain_anthropic import ChatAnthropic
from langchain_core.messages import HumanMessage
from langchain_core.runnables import RunnableConfig
from langgraph.graph import START, MessagesState, StateGraph

# Define a new graph
builder = StateGraph(state_schema=MessagesState)

# Define a chat model
model = ChatAnthropic(model="claude-3-haiku-20240307")


# Define the function that calls the model
def call_model(state: MessagesState, config: RunnableConfig) -> dict:
    # Make sure that config is populated with the session id
    if "configurable" not in config or "session_id" not in config["configurable"]:
        raise ValueError(
            "Make sure that the config includes the following information: {'configurable': {'session_id': 'some_value'}}"
        )
    # Fetch the history of messages and append to it any new messages.
    chat_history = get_chat_history(config["configurable"]["session_id"])
    messages = list(chat_history.messages) + state["messages"]
    ai_message = model.invoke(messages)
    # Finally, update the chat message history to include
    # the new input message from the user together with the
    # response from the model.
    chat_history.add_messages(state["messages"] + [ai_message])
    return {"messages": ai_message}


# Define the node and edge of the graph
builder.add_edge(START, "model")
builder.add_node("model", call_model)

graph = builder.compile()

# Here, we'll create a unique session ID to identify the conversation
session_id = str(uuid.uuid4())
config = {"configurable": {"session_id": session_id}}

input_message = HumanMessage(content="hi! I'm bob")
for event in graph.stream({"messages": [input_message]}, config, stream_mode="values"):
event["messages"][-1].pretty_print()

# Here, let's confirm that the AI remembers our name!
input_message = HumanMessage(content="what was my name?")
for event in graph.stream({"messages": [input_message]}, config, stream_mode="values"):
event["messages"][-1].pretty_print()
================================ Human Message =================================

hi! I'm bob
================================== Ai Message ==================================

Hello Bob! It's nice to meet you. I'm Claude, an AI assistant created by Anthropic. How are you doing today?
================================ Human Message =================================

what was my name?
================================== Ai Message ==================================

You introduced yourself as Bob when you said "hi! I'm bob".
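
As noted above, the conversation ID could instead be carried in the graph state rather than the config. A minimal sketch, assuming a custom state schema; ChatState and call_model_from_state are hypothetical names:

from typing_extensions import Annotated, TypedDict

from langgraph.graph.message import add_messages


class ChatState(TypedDict):
    # Hypothetical schema: the session ID travels with the graph state
    messages: Annotated[list, add_messages]
    session_id: str


def call_model_from_state(state: ChatState) -> dict:
    # Read the session ID from the state instead of the config
    chat_history = get_chat_history(state["session_id"])
    messages = list(chat_history.messages) + state["messages"]
    ai_message = model.invoke(messages)
    chat_history.add_messages(state["messages"] + [ai_message])
    return {"messages": ai_message}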

This also supports streaming LLM content token by token if using langgraph >= 0.2.28.

from langchain_core.messages import AIMessageChunk

for msg, metadata in graph.stream(
    {"messages": input_message}, config, stream_mode="messages"
):
    if msg.content and not isinstance(msg, HumanMessage):
        print(msg.content, end="|", flush=True)
API Reference: AIMessageChunk
You| sai|d your| name was Bob.|

Using RunnableWithMessageHistory

This how-to guide uses the messages and add_messages interfaces of BaseChatMessageHistory directly.

Alternatively, you can use RunnableWithMessageHistory, as LCEL can be used inside any LangGraph node.

To do that, replace the following code:

def call_model(state: MessagesState, config: RunnableConfig) -> dict:
    # Make sure that config is populated with the session id
    if "configurable" not in config or "session_id" not in config["configurable"]:
        raise ValueError(
            "Make sure that the config includes the following information: {'configurable': {'session_id': 'some_value'}}"
        )
    # Fetch the history of messages and append to it any new messages.
    chat_history = get_chat_history(config["configurable"]["session_id"])
    messages = list(chat_history.messages) + state["messages"]
    ai_message = model.invoke(messages)
    # Finally, update the chat message history to include
    # the new input message from the user together with the
    # response from the model.
    chat_history.add_messages(state["messages"] + [ai_message])
    return {"messages": ai_message}

with the corresponding instance of RunnableWithMessageHistory defined in your current application:

runnable = RunnableWithMessageHistory(...)  # From existing code


def call_model(state: MessagesState, config: RunnableConfig) -> dict:
    # RunnableWithMessageHistory takes care of reading the message history
    # and updating it with the new human message and AI response.
    ai_message = runnable.invoke(state["messages"], config)
    return {"messages": ai_message}
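
For context, a minimal sketch of what such an instance could look like, reusing the get_chat_history factory from earlier in this guide (by default, RunnableWithMessageHistory looks the history up via config["configurable"]["session_id"]):

from langchain_core.runnables.history import RunnableWithMessageHistory

# Wrap the chat model so history is read before, and written after, each call
runnable = RunnableWithMessageHistory(model, get_chat_history)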
