create_structured_runnable

langchain_google_vertexai.chains.create_structured_runnable(function: Type[BaseModel] | Sequence[Type[BaseModel]], llm: Runnable, *, prompt: BasePromptTemplate | None = None, use_extra_step: bool = False) → Runnable

Create a runnable sequence that uses Google Vertex AI function calling.

Parameters:
  • function (Type[BaseModel] | Sequence[Type[BaseModel]]) – Either a single pydantic.BaseModel class or a sequence of pydantic.BaseModel classes. For best results, the pydantic.BaseModel classes should include descriptions of their parameters.

  • llm (Runnable) – Language model to use, assumed to support the Google Vertex function-calling API.

  • prompt (BasePromptTemplate | None) – BasePromptTemplate to pass to the model.

  • use_extra_step (bool) – Whether to add an extra step that parses the output into one of the given functions.

Returns:

A runnable sequence that will pass the given function(s) to the model when run.

Return type:

Runnable

Example

from typing import Optional

from langchain_google_vertexai import ChatVertexAI, create_structured_runnable
from langchain_core.prompts import ChatPromptTemplate
from pydantic import BaseModel, Field


class RecordPerson(BaseModel):
    """Record some identifying information about a person."""

    name: str = Field(..., description="The person's name")
    age: int = Field(..., description="The person's age")
    fav_food: Optional[str] = Field(None, description="The person's favorite food")


class RecordDog(BaseModel):
    """Record some identifying information about a dog."""

    name: str = Field(..., description="The dog's name")
    color: str = Field(..., description="The dog's color")
    fav_food: Optional[str] = Field(None, description="The dog's favorite food")


llm = ChatVertexAI(model_name="gemini-pro")
prompt = ChatPromptTemplate.from_template("""
You are a world class algorithm for recording entities.
Make calls to the relevant function to record the entities in the following input: {input}
Tip: Make sure to answer in the correct format"""
                         )
chain = create_structured_runnable([RecordPerson, RecordDog], llm, prompt=prompt)
chain.invoke({"input": "Harry was a chubby brown beagle who loved chicken"})
# -> RecordDog(name="Harry", color="brown", fav_food="chicken")
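As noted in the function parameter description, a single pydantic.BaseModel class can also be passed instead of a sequence. The minimal sketch below reuses the schemas, llm, and prompt defined above with an illustrative input; the variable name and the shown output are assumptions, and the actual result depends on the model:

# Bind only one schema; the chain then always targets RecordPerson.
person_chain = create_structured_runnable(RecordPerson, llm, prompt=prompt)
person_chain.invoke({"input": "Sally is 13 years old and loves pizza"})
# Expected (model-dependent): RecordPerson(name="Sally", age=13, fav_food="pizza")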