LLMInputOutputAdapter#

class langchain_aws.llms.bedrock.LLMInputOutputAdapter[source]#

Adapter class that prepares inputs from LangChain into the format the LLM model expects.

It also provides helper functions to extract the generated text from the model response.
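The adapter pattern can be illustrated with a minimal stand-in (not the actual langchain-aws source): a class holding a provider-to-output-key map, mirroring the role of the `provider_to_output_key_map` attribute listed below. The provider names and response keys here are assumptions for illustration only.

```python
from typing import Optional


class MiniAdapter:
    # Maps a provider name to the key that holds the generated text
    # in that provider's (non-streaming) response body.
    # NOTE: these keys are illustrative assumptions, not the real map.
    provider_to_output_key_map = {
        "anthropic": "completion",
        "cohere": "text",
        "ai21": "data",
    }

    @classmethod
    def output_key(cls, provider: str) -> Optional[str]:
        # Returns None for providers without a known output key.
        return cls.provider_to_output_key_map.get(provider)


print(MiniAdapter.output_key("cohere"))  # text
```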

Attributes

provider_to_output_key_map

Methods

aprepare_output_stream(provider, response[, ...])

prepare_input(provider, model_kwargs[, ...])

prepare_output(provider, response)

prepare_output_stream(provider, response[, ...])

classmethod aprepare_output_stream(provider: str, response: Any, stop: List[str] | None = None, messages_api: bool = False, coerce_content_to_string: bool = False) AsyncIterator[GenerationChunk | AIMessageChunk][source]#
Parameters:
  • provider (str)

  • response (Any)

  • stop (List[str] | None)

  • messages_api (bool)

  • coerce_content_to_string (bool)

Return type:

AsyncIterator[GenerationChunk | AIMessageChunk]
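Consuming the real `aprepare_output_stream` requires a live Bedrock streaming response, so the sketch below only illustrates the `AsyncIterator` contract: yield one chunk per streamed event until an optional stop sequence is hit. Every name here (`fake_aprepare_output_stream`, `FakeChunk`) is hypothetical.

```python
import asyncio
from typing import AsyncIterator, List, Optional


class FakeChunk:
    """Stand-in for GenerationChunk: just carries a text fragment."""

    def __init__(self, text: str) -> None:
        self.text = text


async def fake_aprepare_output_stream(
    events: List[str], stop: Optional[List[str]] = None
) -> AsyncIterator[FakeChunk]:
    # Yield chunks in order; terminate early on a stop sequence.
    for piece in events:
        if stop and piece in stop:
            return
        yield FakeChunk(piece)


async def main() -> str:
    parts = []
    async for chunk in fake_aprepare_output_stream(
        ["Hel", "lo", "<END>"], stop=["<END>"]
    ):
        parts.append(chunk.text)
    return "".join(parts)


print(asyncio.run(main()))  # Hello
```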

classmethod prepare_input(provider: str, model_kwargs: Dict[str, Any], prompt: str | None = None, system: str | None = None, messages: List[Dict] | None = None, tools: List[AnthropicTool] | None = None, *, max_tokens: int | None = None, temperature: float | None = None) Dict[str, Any][source]#
Parameters:
  • provider (str)

  • model_kwargs (Dict[str, Any])

  • prompt (str | None)

  • system (str | None)

  • messages (List[Dict] | None)

  • tools (List[AnthropicTool] | None)

  • max_tokens (int | None)

  • temperature (float | None)

Return type:

Dict[str, Any]
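A hedged sketch of what a `prepare_input`-style helper does: merge the caller's model kwargs with the prompt or messages and the sampling parameters into one request body. The body shape below (messages for Anthropic, a bare prompt otherwise) is an assumption for illustration, not the exact format langchain-aws emits.

```python
from typing import Any, Dict, List, Optional


def sketch_prepare_input(
    provider: str,
    model_kwargs: Dict[str, Any],
    prompt: Optional[str] = None,
    messages: Optional[List[Dict]] = None,
    max_tokens: Optional[int] = None,
    temperature: Optional[float] = None,
) -> Dict[str, Any]:
    # Start from the caller-supplied kwargs, then layer in the
    # provider-specific fields. Hypothetical dispatch logic.
    body: Dict[str, Any] = dict(model_kwargs)
    if provider == "anthropic" and messages is not None:
        body["messages"] = messages
    elif prompt is not None:
        body["prompt"] = prompt
    if max_tokens is not None:
        body["max_tokens"] = max_tokens
    if temperature is not None:
        body["temperature"] = temperature
    return body


body = sketch_prepare_input(
    "anthropic",
    {"top_p": 0.9},
    messages=[{"role": "user", "content": "Hi"}],
    max_tokens=256,
)
print(sorted(body))  # ['max_tokens', 'messages', 'top_p']
```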

classmethod prepare_output(provider: str, response: Any) dict[source]#
Parameters:
  • provider (str)

  • response (Any)

Return type:

dict
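A hedged sketch of `prepare_output`'s job: pull the generated text out of a provider's response body using a per-provider output key. The response shapes and key names below are illustrative assumptions.

```python
from typing import Any, Dict

# Hypothetical per-provider output keys (see provider_to_output_key_map).
OUTPUT_KEY = {"anthropic": "completion", "cohere": "text"}


def sketch_prepare_output(provider: str, response_body: Dict[str, Any]) -> dict:
    # Look up where this provider puts its generated text, defaulting
    # to "text", and normalize into a common dict shape.
    key = OUTPUT_KEY.get(provider, "text")
    return {"text": response_body.get(key, ""), "provider": provider}


result = sketch_prepare_output("anthropic", {"completion": "Hello!"})
print(result["text"])  # Hello!
```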

classmethod prepare_output_stream(provider: str, response: Any, stop: List[str] | None = None, messages_api: bool = False, coerce_content_to_string: bool = False) Iterator[GenerationChunk | AIMessageChunk][source]#
Parameters:
  • provider (str)

  • response (Any)

  • stop (List[str] | None)

  • messages_api (bool)

  • coerce_content_to_string (bool)

Return type:

Iterator[GenerationChunk | AIMessageChunk]
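The synchronous streaming path can be sketched the same way: yield one text chunk per streamed event, stopping early on a stop sequence. This stand-in decodes newline-delimited JSON events; the `outputText` event shape is an assumption for illustration, not a documented Bedrock format.

```python
import json
from typing import Iterator, List, Optional


def sketch_prepare_output_stream(
    raw_events: List[bytes], stop: Optional[List[str]] = None
) -> Iterator[str]:
    # Decode each raw event and yield its text fragment; end the
    # stream as soon as a stop sequence appears in a fragment.
    for raw in raw_events:
        text = json.loads(raw)["outputText"]
        if stop and any(s in text for s in stop):
            return
        yield text


events = [b'{"outputText": "Hel"}', b'{"outputText": "lo"}']
print("".join(sketch_prepare_output_stream(events)))  # Hello
```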