
ChatAI21

Overview

This notebook covers how to get started with AI21 chat models. Note that different chat models support different parameters; see the AI21 documentation to learn more about the parameters available for your chosen model, and see all of AI21's LangChain components for the full list of integrations.

Integration details

Class | Package | Local | Serializable | JS support | Package downloads | Package latest
ChatAI21 | langchain-ai21 | ❌ | beta | ❌ | PyPI - Downloads | PyPI - Version

Model features

Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs

Setup

Credentials

We'll need to get an AI21 API key and set the AI21_API_KEY environment variable:

import os
from getpass import getpass

if "AI21_API_KEY" not in os.environ:
    os.environ["AI21_API_KEY"] = getpass()

If you want automated tracing of your model calls, you can also set your LangSmith API key by uncommenting the lines below:

# os.environ["LANGCHAIN_TRACING_V2"] = "true"
# os.environ["LANGCHAIN_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")

Installation

The LangChain AI21 integration lives in the langchain-ai21 package:

!pip install -qU langchain-ai21

Instantiation

Now we can instantiate our model object and generate chat completions:

from langchain_ai21 import ChatAI21

llm = ChatAI21(model="jamba-instruct", temperature=0)
API Reference: ChatAI21
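
Beyond model and temperature, the constructor accepts further generation settings. The snippet below is a minimal sketch that assumes the max_tokens and api_key parameters listed in the ChatAI21 API reference; the name llm_custom is just for illustration, and you should check the reference for the options supported by your installed version:

# Sketch only: parameter availability may vary by version of langchain-ai21.
llm_custom = ChatAI21(
    model="jamba-instruct",
    temperature=0,
    max_tokens=256,  # assumed parameter: cap the length of the generated reply
    api_key=os.environ["AI21_API_KEY"],  # assumed parameter: pass the key explicitly
)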

Invocation

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
ai_msg
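
The call returns an AIMessage; to see just the generated translation rather than the full message object, print its content attribute:

print(ai_msg.content)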

Chaining

We can chain our model with a prompt template like so:

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm
chain.invoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
API Reference: ChatPromptTemplate
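
As a variation on the chain above (not part of the original example), you can append LangChain's StrOutputParser so the chain returns a plain string instead of a message object:

from langchain_core.output_parsers import StrOutputParser

# Same prompt and model as above, with a parser that extracts message.content.
text_chain = prompt | llm | StrOutputParser()

print(
    text_chain.invoke(
        {
            "input_language": "English",
            "output_language": "German",
            "input": "I love programming.",
        }
    )
)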

Tool calling / function calling

This example shows how to use tool calling with AI21 models:

import os
from getpass import getpass

from langchain_ai21.chat_models import ChatAI21
from langchain_core.messages import HumanMessage, SystemMessage, ToolMessage
from langchain_core.tools import tool
from langchain_core.utils.function_calling import convert_to_openai_tool

if "AI21_API_KEY" not in os.environ:
os.environ["AI21_API_KEY"] = getpass()


@tool
def get_weather(location: str, date: str) -> str:
"""“Provide the weather for the specified location on the given date.”"""
if location == "New York" and date == "2024-12-05":
return "25 celsius"
elif location == "New York" and date == "2024-12-06":
return "27 celsius"
elif location == "London" and date == "2024-12-05":
return "22 celsius"
return "32 celsius"


llm = ChatAI21(model="jamba-1.5-mini")

llm_with_tools = llm.bind_tools([convert_to_openai_tool(get_weather)])

chat_messages = [
    SystemMessage(
        content="You are a helpful assistant. You can use the provided tools "
        "to assist with various tasks and provide accurate information"
    )
]

human_messages = [
    HumanMessage(
        content="What is the forecast for the weather in New York on December 5, 2024?"
    ),
    HumanMessage(content="And what about the 2024-12-06?"),
    HumanMessage(content="OK, thank you."),
    HumanMessage(content="What is the expected weather in London on December 5, 2024?"),
]


for human_message in human_messages:
    print(f"User: {human_message.content}")
    chat_messages.append(human_message)
    response = llm_with_tools.invoke(chat_messages)
    chat_messages.append(response)
    if response.tool_calls:
        tool_call = response.tool_calls[0]
        if tool_call["name"] == "get_weather":
            weather = get_weather.invoke(
                {
                    "location": tool_call["args"]["location"],
                    "date": tool_call["args"]["date"],
                }
            )
            chat_messages.append(
                ToolMessage(content=weather, tool_call_id=tool_call["id"])
            )
            llm_answer = llm_with_tools.invoke(chat_messages)
            print(f"Assistant: {llm_answer.content}")
    else:
        print(f"Assistant: {response.content}")

API reference

For detailed documentation of all ChatAI21 features and configurations, head to the API reference: https://python.langchain.com/api_reference/ai21/chat_models/langchain_ai21.chat_models.ChatAI21.html

