
Anthropic Claude


In this notebook, we demonstrate how to use Anthropic Claude models for AgentChat.

Requirements

To use Anthropic Claude models with AutoGen, first install the pyautogen and anthropic packages.

To try the function-call feature of the Claude models, you need to install anthropic>=0.23.1.

# !pip install pyautogen
!pip install "anthropic>=0.23.1"
import inspect
import json
from typing import Any, Dict, List, Union

from anthropic import Anthropic
from anthropic import __version__ as anthropic_version
from anthropic.types import Completion, Message
from openai.types.chat.chat_completion import ChatCompletionMessage
from typing_extensions import Annotated

import autogen
from autogen import AssistantAgent, UserProxyAgent

TOOL_ENABLED = anthropic_version >= "0.23.1"
if TOOL_ENABLED:
    from anthropic.types.beta.tools import ToolsBetaMessage
else:
    ToolsBetaMessage = object
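
One caveat: the lexicographic string comparison above happens to work for these particular versions, but it can misrank version strings in general (for example, "0.3.0" compares greater than "0.23.1" as a string). If you prefer a robust gate, a small alternative sketch, assuming the packaging distribution is available in your environment, might look like this:

# Sketch of a more robust version gate (assumes `packaging` is installed).
from packaging.version import Version

TOOL_ENABLED = Version(anthropic_version) >= Version("0.23.1")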

Create an Anthropic model client that follows the ModelClient protocol

We will implement an Anthropic client that adheres to the ModelClient protocol and response structure, which is defined in client.py and shown below.

class ModelClient(Protocol):
    """
    A client class must implement the following methods:
    - create must return a response object that implements the ModelClientResponseProtocol
    - cost must return the cost of the response
    - get_usage must return a dict with the following keys:
        - prompt_tokens
        - completion_tokens
        - total_tokens
        - cost
        - model

    This class is used to create a client that can be used by OpenAIWrapper.
    The response returned from create must adhere to the ModelClientResponseProtocol but can be extended however needed.
    The message_retrieval method must be implemented to return a list of str or a list of messages from the response.
    """

    RESPONSE_USAGE_KEYS = ["prompt_tokens", "completion_tokens", "total_tokens", "cost", "model"]

    class ModelClientResponseProtocol(Protocol):
        class Choice(Protocol):
            class Message(Protocol):
                content: Optional[str]

            message: Message

        choices: List[Choice]
        model: str

    def create(self, params) -> ModelClientResponseProtocol:
        ...

    def message_retrieval(
        self, response: ModelClientResponseProtocol
    ) -> Union[List[str], List[ModelClient.ModelClientResponseProtocol.Choice.Message]]:
        """
        Retrieve and return a list of strings or a list of Choice.Message from the response.

        NOTE: if a list of Choice.Message is returned, it currently needs to contain the fields of OpenAI's
        ChatCompletion Message object, since that is expected for function or tool calls in the rest of the
        codebase at the moment, unless a custom agent is being used.
        """
        ...

    def cost(self, response: ModelClientResponseProtocol) -> float:
        ...

    @staticmethod
    def get_usage(response: ModelClientResponseProtocol) -> Dict:
        """Return usage summary of the response using RESPONSE_USAGE_KEYS."""
        ...
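
To make the contract concrete, here is a minimal, purely hypothetical client that satisfies the protocol by echoing the last user message. It is an illustration only, not part of this notebook's implementation:

# Toy ModelClient (illustration only): echoes the last message back.
from types import SimpleNamespace


class EchoClient:
    def __init__(self, config, **kwargs):
        self.model = config.get("model", "echo")

    def create(self, params):
        # Build a response object with the shape the protocol expects:
        # .choices -> [.message.content], plus .model for bookkeeping.
        last = params["messages"][-1]["content"]
        message = SimpleNamespace(content=f"Echo: {last}")
        return SimpleNamespace(choices=[SimpleNamespace(message=message)], model=self.model, cost=0.0)

    def message_retrieval(self, response):
        return [choice.message.content for choice in response.choices]

    def cost(self, response):
        return response.cost

    @staticmethod
    def get_usage(response):
        # Dummy usage numbers; a real client reads these from the API response.
        return {"prompt_tokens": 0, "completion_tokens": 0, "total_tokens": 0,
                "cost": response.cost, "model": response.model}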

Implementation of AnthropicClient

You can find an introduction to the Claude-3-Opus model here.

Since the Python SDK provided by anthropic is structured similarly to OpenAI's, we will follow the implementation of autogen.oai.client.OpenAIClient.

class AnthropicClient:
    def __init__(self, config: Dict[str, Any]):
        self._config = config
        self.model = config["model"]
        # Forward only the kwargs that Anthropic.__init__ actually accepts.
        anthropic_kwargs = set(inspect.getfullargspec(Anthropic.__init__).kwonlyargs)
        filter_dict = {k: v for k, v in config.items() if k in anthropic_kwargs}
        self._client = Anthropic(**filter_dict)

        self._last_tooluse_status = {}

    def message_retrieval(
        self, response: Union[Message, ToolsBetaMessage]
    ) -> Union[List[str], List[ChatCompletionMessage]]:
        """Retrieve the messages from the response."""
        messages = response.content
        if len(messages) == 0:
            return [None]
        res = []
        if TOOL_ENABLED:
            for choice in messages:
                if choice.type == "tool_use":
                    res.insert(0, self.response_to_openai_message(choice))
                    self._last_tooluse_status["tool_use"] = choice.model_dump()
                else:
                    res.append(choice.text)
                    self._last_tooluse_status["think"] = choice.text

            return res

        else:
            return [  # type: ignore [return-value]
                choice.text if choice.message.function_call is not None else choice.message.content  # type: ignore [union-attr]
                for choice in messages
            ]

    def create(self, params: Dict[str, Any]) -> Completion:
        """Create a completion for a given config.

        Args:
            params: The params for the completion.

        Returns:
            The completion.
        """
        if "tools" in params:
            converted_functions = self.convert_tools_to_functions(params["tools"])
            params["functions"] = params.get("functions", []) + converted_functions

        raw_contents = params["messages"]
        processed_messages = []
        for message in raw_contents:

            if message["role"] == "system":
                params["system"] = message["content"]
            elif message["role"] == "function":
                processed_messages.append(self.return_function_call_result(message["content"]))
            elif "function_call" in message:
                processed_messages.append(self.restore_last_tooluse_status())
            elif message["content"] == "":
                # I'm not sure how to elegantly terminate the conversation, please give me some advice about this.
                message["content"] = "I'm done. Please send TERMINATE"
                processed_messages.append(message)
            else:
                processed_messages.append(message)

        params["messages"] = processed_messages

        if TOOL_ENABLED and "functions" in params:
            completions: Completion = self._client.beta.tools.messages
        else:
            completions: Completion = self._client.messages  # type: ignore [attr-defined]

        # Streaming is not supported yet.
        params = params.copy()
        params["stream"] = False
        params.pop("model_client_cls")
        params["max_tokens"] = params.get("max_tokens", 4096)
        if "functions" in params:
            tools_configs = params.pop("functions")
            tools_configs = [self.openai_func_to_anthropic(tool) for tool in tools_configs]
            params["tools"] = tools_configs
        response = completions.create(**params)

        return response

    def cost(self, response: Completion) -> float:
        """Calculate the cost of the response."""
        total = 0.0
        tokens = {
            "input": response.usage.input_tokens if response.usage is not None else 0,
            "output": response.usage.output_tokens if response.usage is not None else 0,
        }
        # Claude-3-Opus pricing in USD per million tokens.
        price_per_million = {
            "input": 15,
            "output": 75,
        }
        for key, value in tokens.items():
            total += value * price_per_million[key] / 1_000_000

        return total

    def response_to_openai_message(self, response) -> ChatCompletionMessage:
        """Convert an Anthropic tool_use block into an OpenAI-style function_call message."""
        dict_response = response.model_dump()
        return ChatCompletionMessage(
            content=None,
            role="assistant",
            function_call={"name": dict_response["name"], "arguments": json.dumps(dict_response["input"])},
        )

    def restore_last_tooluse_status(self) -> Dict:
        cached_content = []
        if "think" in self._last_tooluse_status:
            cached_content.append({"type": "text", "text": self._last_tooluse_status["think"]})
        cached_content.append(self._last_tooluse_status["tool_use"])
        res = {"role": "assistant", "content": cached_content}
        return res

    def return_function_call_result(self, result: str) -> Dict:
        return {
            "role": "user",
            "content": [
                {
                    "type": "tool_result",
                    "tool_use_id": self._last_tooluse_status["tool_use"]["id"],
                    "content": result,
                }
            ],
        }

    @staticmethod
    def openai_func_to_anthropic(openai_func: dict) -> dict:
        res = openai_func.copy()
        res["input_schema"] = res.pop("parameters")
        return res

    @staticmethod
    def get_usage(response: Completion) -> Dict:
        return {
            "prompt_tokens": response.usage.input_tokens if response.usage is not None else 0,
            "completion_tokens": response.usage.output_tokens if response.usage is not None else 0,
            "total_tokens": (
                response.usage.input_tokens + response.usage.output_tokens if response.usage is not None else 0
            ),
            "cost": response.cost if hasattr(response, "cost") else 0,
            "model": response.model,
        }

    @staticmethod
    def convert_tools_to_functions(tools: List) -> List:
        functions = []
        for tool in tools:
            if tool.get("type") == "function" and "function" in tool:
                functions.append(tool["function"])

        return functions
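
As a quick illustration (not from the original notebook), here is what openai_func_to_anthropic does to an OpenAI-style function schema: the parameters field is renamed to input_schema, which is the format the Anthropic tools beta expects. The example schema below is hypothetical:

# Hypothetical OpenAI-style function schema, used only to demonstrate the conversion.
openai_style = {
    "name": "get_weather",
    "description": "Get the current weather in a given location.",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}
anthropic_style = AnthropicClient.openai_func_to_anthropic(openai_style)
print(anthropic_style["input_schema"])  # the former "parameters" block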

Set the config for the Anthropic API

You can add any parameters that are needed for the custom model loading in the same configuration list.

It is important to add the model_client_cls field and set it to a string that corresponds to the class name: "AnthropicClient".

import os

config_list_claude = [
    {
        # Choose your model name.
        "model": "claude-3-sonnet-20240229",
        # Provide your API key here.
        "api_key": os.getenv("ANTHROPIC_API_KEY"),
        "base_url": "https://api.anthropic.com",
        "api_type": "anthropic",
        "model_client_cls": "AnthropicClient",
    }
]
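
Recall that AnthropicClient.__init__ forwards only the keys that Anthropic.__init__ accepts as keyword arguments; the remaining keys ("model", "api_type", "model_client_cls") are consumed by AutoGen or by the client itself. A small sketch, assuming the cells above have been run, to see which keys pass the filter:

# Illustration only: inspect which config keys would reach Anthropic.__init__.
anthropic_kwargs = set(inspect.getfullargspec(Anthropic.__init__).kwonlyargs)
print(sorted(k for k in config_list_claude[0] if k in anthropic_kwargs))
# Expected to include "api_key" and "base_url" (an assumption based on the current SDK signature).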

Construct the agents

Construct a simple conversation between a user proxy and a conversable agent, based on the Claude-3 model.

assistant = AssistantAgent(
    "assistant",
    llm_config={
        "config_list": config_list_claude,
    },
)

user_proxy = UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    code_execution_config={
        "work_dir": "coding",
        "use_docker": False,
    },
    is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
    max_consecutive_auto_reply=1,
)
[autogen.oai.client: 04-08 22:15:59] {419} INFO - Detected custom model client in config: AnthropicClient, model client can not be used until register_model_client is called.

Function calls in the latest Anthropic API

Anthropic just announced that tool use is now in public beta in the Anthropic API. To use this feature, please install anthropic>=0.23.1.

@user_proxy.register_for_execution()
@assistant.register_for_llm(name="get_weather", description="Get the current weather in a given location.")
def preprocess(location: Annotated[str, "The city and state, e.g. Toronto, ON."]) -> str:
    return "Absolutely cloudy and rainy"
[autogen.oai.client: 04-08 22:15:59] {419} INFO - Detected custom model client in config: AnthropicClient, model client can not be used until register_model_client is called.
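
If you want to confirm what was registered, an optional check, assuming pyautogen's convention of storing registered tool schemas under the agent's llm_config["tools"], is to print the assistant's configuration:

# Optional check (assumption: pyautogen keeps registered tool schemas in llm_config["tools"]).
print(json.dumps(assistant.llm_config.get("tools", []), indent=2))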

Register the custom client class to the assistant agent

assistant.register_model_client(model_client_cls=AnthropicClient)
user_proxy.initiate_chat(
    assistant,
    message="What's the weather in Toronto?",
)
user_proxy (to assistant):

What's the weather in Toronto?

--------------------------------------------------------------------------------
assistant (to user_proxy):

***** Suggested function call: get_weather *****
Arguments:
{"location": "Toronto, ON"}
************************************************

--------------------------------------------------------------------------------

>>>>>>>> EXECUTING FUNCTION get_weather...
user_proxy (to assistant):

***** Response from calling function (get_weather) *****
Absolutely cloudy and rainy
*********************************************************

--------------------------------------------------------------------------------
assistant (to user_proxy):

The tool returned that the current weather in Toronto, ON is absolutely cloudy and rainy.

--------------------------------------------------------------------------------
ChatResult(chat_id=None, chat_history=[{'content': "What's the weather in Toronto?", 'role': 'assistant'}, {'function_call': {'arguments': '{"location": "Toronto, ON"}', 'name': 'get_weather'}, 'content': None, 'role': 'assistant'}, {'content': 'Absolutely cloudy and rainy', 'name': 'get_weather', 'role': 'function'}, {'content': 'The tool returned that the current weather in Toronto, ON is absolutely cloudy and rainy.', 'role': 'user'}], summary='The tool returned that the current weather in Toronto, ON is absolutely cloudy and rainy.', cost=({'total_cost': 0.030494999999999998, 'claude-3-sonnet-20240229': {'cost': 0.030494999999999998, 'prompt_tokens': 1533, 'completion_tokens': 100, 'total_tokens': 1633}}, {'total_cost': 0}), human_input=[])