OpenAI Agent with Multiple Function Calls in a Single Turn¶
With the latest OpenAI API (version 1.1.0+), users can now execute multiple function calls within a single turn of the "user"-"agent" conversation. We have updated our library to enable this new feature, and in this notebook we'll show you how it works!
NOTE: OpenAI refers to this as "parallel" function calling, but the current implementation does not actually invoke the multiple function calls in parallel. So, in our current implementation, this is "parallelizable" function calling.
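To make the "parallelizable" distinction concrete, here is a minimal, self-contained sketch (the hard-coded message and tool registry below are purely illustrative, not part of the library): an assistant message shaped like the OpenAI chat API's `tool_calls` field carries two function calls at once, and the client executes them one after another in sequence rather than concurrently.

```python
import json

# Hypothetical assistant message carrying two tool calls in a single turn,
# shaped like the OpenAI chat API's `tool_calls` field (hard-coded here,
# not returned by a real model).
assistant_message = {
    "role": "assistant",
    "tool_calls": [
        {"id": "call_1", "function": {"name": "multiply", "arguments": '{"a": 121, "b": 3}'}},
        {"id": "call_2", "function": {"name": "add", "arguments": '{"a": 363, "b": 42}'}},
    ],
}

# Local tool registry standing in for the agent's tools.
TOOLS = {
    "multiply": lambda a, b: a * b,
    "add": lambda a, b: a + b,
}

# Execute each requested call sequentially: "parallelizable", not parallel.
results = []
for call in assistant_message["tool_calls"]:
    fn = TOOLS[call["function"]["name"]]
    kwargs = json.loads(call["function"]["arguments"])
    results.append(fn(**kwargs))

print(results)  # [363, 405]
```

Both calls arrive in one assistant turn; whether the client runs them sequentially or concurrently is an implementation choice on the client side.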
In [ ]:
%pip install llama-index-agent-openai
%pip install llama-index-llms-openai
In [ ]:
from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI
from llama_index.core.tools import BaseTool, FunctionTool
Setup¶
If you have seen any of our previous notebooks on OpenAI agents, then you are already familiar with the recipe that we need to follow here. But if not, or if you'd like a refresher, then (at a high level) what we need to do is the following:
- Define a set of tools (we will use FunctionTool), since agents work with tools
- Define the LLM for the agent
- Define an OpenAIAgent
In [ ]:
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the resulting integer."""
    return a * b


multiply_tool = FunctionTool.from_defaults(fn=multiply)
In [ ]:
def add(a: int, b: int) -> int:
    """Add two integers and return the resulting integer."""
    return a + b


add_tool = FunctionTool.from_defaults(fn=add)
In [ ]:
llm = OpenAI(model="gpt-3.5-turbo-1106")
agent = OpenAIAgent.from_tools(
[multiply_tool, add_tool], llm=llm, verbose=True
)
Sync Mode¶
In [ ]:
response = agent.chat("What is (121 * 3) + 42?")
print(str(response))
STARTING TURN 1
---------------
=== Calling Function ===
Calling function: multiply with args: {"a": 121, "b": 3}
Got output: 363
========================
=== Calling Function ===
Calling function: add with args: {"a": 363, "b": 42}
Got output: 405
========================
STARTING TURN 2
---------------
The result of (121 * 3) + 42 is 405.
In [ ]:
response = agent.stream_chat("What is (121 * 3) + 42?")
STARTING TURN 1
---------------
=== Calling Function ===
Calling function: add with args: {"a":363,"b":42}
Got output: 405
========================
STARTING TURN 2
---------------
Async Mode¶
In [ ]:
import nest_asyncio
nest_asyncio.apply()
In [ ]:
response = await agent.achat("What is (121 * 3) + 42?")
print(str(response))
STARTING TURN 1
---------------
=== Calling Function ===
Calling function: add with args: {"a":363,"b":42}
Got output: 405
========================
STARTING TURN 2
---------------
The result of (121 * 3) + 42 is 405.
In [ ]:
response = await agent.astream_chat("What is (121 * 3) + 42?")
response_gen = response.response_gen
async for token in response.async_response_gen():
print(token, end="")
STARTING TURN 1
---------------
=== Calling Function ===
Calling function: multiply with args: {"a": 121, "b": 3}
Got output: 363
========================
=== Calling Function ===
Calling function: add with args: {"a": 363, "b": 42}
Got output: 405
========================
STARTING TURN 2
---------------
The result of (121 * 3) + 42 is 405.
In [ ]:
import json


# Example dummy function, hard coded to return the same weather
# In production, this could be your backend API or an external API
def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location."""
    if "tokyo" in location.lower():
        return json.dumps(
            {"location": location, "temperature": "10", "unit": "celsius"}
        )
    elif "san francisco" in location.lower():
        return json.dumps(
            {"location": location, "temperature": "72", "unit": "fahrenheit"}
        )
    else:
        return json.dumps(
            {"location": location, "temperature": "22", "unit": "celsius"}
        )


weather_tool = FunctionTool.from_defaults(fn=get_current_weather)
In [ ]:
llm = OpenAI(model="gpt-3.5-turbo-1106")
agent = OpenAIAgent.from_tools([weather_tool], llm=llm, verbose=True)
response = agent.chat(
"What's the weather like in San Francisco, Tokyo, and Paris?"
)
STARTING TURN 1
---------------
=== Calling Function ===
Calling function: get_current_weather with args: {"location": "San Francisco", "unit": "fahrenheit"}
Got output: {"location": "San Francisco", "temperature": "72", "unit": "fahrenheit"}
========================
=== Calling Function ===
Calling function: get_current_weather with args: {"location": "Tokyo", "unit": "fahrenheit"}
Got output: {"location": "Tokyo", "temperature": "10", "unit": "celsius"}
========================
=== Calling Function ===
Calling function: get_current_weather with args: {"location": "Paris", "unit": "fahrenheit"}
Got output: {"location": "Paris", "temperature": "22", "unit": "celsius"}
========================
STARTING TURN 2
---------------
All of the function calls above were completed within a single conversational turn between the "assistant" and the "user". Interestingly, an older version of GPT-3.5 is not as advanced as its successor: it completes the same task over 3 separate turns. For demonstration, here is how it performs.
In [ ]:
llm = OpenAI(model="gpt-3.5-turbo-0613")
agent = OpenAIAgent.from_tools([weather_tool], llm=llm, verbose=True)
response = agent.chat(
"What's the weather like in San Francisco, Tokyo, and Paris?"
)
STARTING TURN 1
---------------
=== Calling Function ===
Calling function: get_current_weather with args: { "location": "San Francisco" }
Got output: {"location": "San Francisco", "temperature": "72", "unit": "fahrenheit"}
========================
STARTING TURN 2
---------------
=== Calling Function ===
Calling function: get_current_weather with args: { "location": "Tokyo" }
Got output: {"location": "Tokyo", "temperature": "10", "unit": "celsius"}
========================
STARTING TURN 3
---------------
=== Calling Function ===
Calling function: get_current_weather with args: { "location": "Paris" }
Got output: {"location": "Paris", "temperature": "22", "unit": "celsius"}
========================
STARTING TURN 4
---------------
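One way to read the contrast between the two traces is through a simple turn count: each batch of tool calls the model emits costs one turn, plus one final turn to compose the answer. A toy sketch (the `count_turns` helper is hypothetical, written here only to illustrate the bookkeeping):

```python
def count_turns(tool_call_batches):
    """Each batch of tool calls the model emits costs one turn, plus one
    final turn in which the model composes its answer."""
    return len(tool_call_batches) + 1


# gpt-3.5-turbo-1106: all three weather lookups in one batch -> 2 turns
newer = count_turns([["San Francisco", "Tokyo", "Paris"]])

# gpt-3.5-turbo-0613: one lookup per turn -> 4 turns
older = count_turns([["San Francisco"], ["Tokyo"], ["Paris"]])

print(newer, older)  # 2 4
```

These counts match the traces above: the newer model finishes after "STARTING TURN 2", while the older one runs through "STARTING TURN 4".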
Conclusion¶
So, as you can see, the llama_index library can handle multiple function calls (as well as single function calls) within a single conversational turn between the user and the OpenAI agent!