langchain.agents.mrkl.base.ZeroShotAgent¶
- class langchain.agents.mrkl.base.ZeroShotAgent[source]¶
Bases: Agent
[Deprecated] Agent for the MRKL chain.
Notes
Deprecated since version 0.1.0: Use create_react_agent instead.
Create a new model by parsing and validating input data from keyword arguments.
Raises ValidationError if the input data cannot be parsed to form a valid model.
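Because this class is deprecated, new code should build the equivalent ReAct-style agent with create_react_agent. A minimal migration sketch (assuming langchain >= 0.1 and the langchainhub package; FakeListLLM and the "noop" tool are placeholders for a real model and real tools):
.. code-block:: python

    # Migration sketch: create_react_agent replaces the deprecated ZeroShotAgent.
    from langchain import hub
    from langchain.agents import AgentExecutor, Tool, create_react_agent
    from langchain_community.llms import FakeListLLM

    tools = [Tool(name="noop", func=lambda q: "ok", description="A do-nothing placeholder tool.")]
    llm = FakeListLLM(responses=["Final Answer: done"])  # stand-in for a real LLM

    prompt = hub.pull("hwchase17/react")  # stock ReAct prompt using the same Question/Thought/Action format
    agent = create_react_agent(llm, tools, prompt)
    executor = AgentExecutor(agent=agent, tools=tools)
    # executor.invoke({"input": "..."})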
- param allowed_tools: Optional[List[str]] = None¶
- param llm_chain: langchain.chains.llm.LLMChain [Required]¶
- param output_parser: langchain.agents.agent.AgentOutputParser [Optional]¶
- async aplan(intermediate_steps: List[Tuple[AgentAction, str]], callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) Union[AgentAction, AgentFinish] ¶
Given input, decide what to do.
- Parameters:
intermediate_steps: Steps the LLM has taken to date, along with the observations.
callbacks: Callbacks to run.
**kwargs: User inputs.
- Returns:
Action specifying what tool to use.
- Parameters
intermediate_steps (List[Tuple[AgentAction, str]]) –
callbacks (Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]]) –
kwargs (Any) –
- Return type
Union[AgentAction, AgentFinish]
- classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) Model ¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.
- Parameters
_fields_set (Optional[SetStr]) –
values (Any) –
- Return type
Model
- copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) Model ¶
Duplicate a model, optionally choose which fields to include, exclude and change.
- Parameters
include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to include in new model
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to exclude from new model, as with values this takes precedence over include
update (Optional[DictStrAny]) – values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data
deep (bool) – set to True to make a deep copy of the model
self (Model) –
- Returns
new model instance
- Return type
Model
- classmethod create_prompt(tools: Sequence[BaseTool], prefix: str = 'Answer the following questions as best you can. You have access to the following tools:', suffix: str = 'Begin!\n\nQuestion: {input}\nThought:{agent_scratchpad}', format_instructions: str = 'Use the following format:\n\nQuestion: the input question you must answer\nThought: you should always think about what to do\nAction: the action to take, should be one of [{tool_names}]\nAction Input: the input to the action\nObservation: the result of the action\n... (this Thought/Action/Action Input/Observation can repeat N times)\nThought: I now know the final answer\nFinal Answer: the final answer to the original input question', input_variables: Optional[List[str]] = None) PromptTemplate [source]¶
Create a prompt in the style of the zero-shot agent.
- Parameters:
tools: List of tools the agent will have access to, used to format the prompt.
prefix: String to put before the list of tools.
suffix: String to put after the list of tools.
input_variables: List of input variables the final prompt will expect.
- Returns:
A PromptTemplate assembled from the pieces here.
- Parameters
tools (Sequence[BaseTool]) –
prefix (str) –
suffix (str) –
format_instructions (str) –
input_variables (Optional[List[str]]) –
- Return type
PromptTemplate
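A short sketch of building the default prompt for a hypothetical tool (the Calculator tool below is illustrative only):
.. code-block:: python

    # Sketch: assembling the default zero-shot prompt from a tool list.
    from langchain.agents import Tool
    from langchain.agents.mrkl.base import ZeroShotAgent

    tools = [
        Tool(
            name="Calculator",  # hypothetical tool, used only to format the prompt
            func=lambda expr: str(eval(expr)),
            description="Evaluates arithmetic expressions.",
        )
    ]

    prompt = ZeroShotAgent.create_prompt(tools)  # default prefix/suffix/format_instructions
    print(prompt.input_variables)  # typically ['agent_scratchpad', 'input'] when not overridden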
- dict(**kwargs: Any) Dict ¶
Return a dictionary representation of the agent.
- Parameters
kwargs (Any) –
- Return type
Dict
- classmethod from_llm_and_tools(llm: BaseLanguageModel, tools: Sequence[BaseTool], callback_manager: Optional[BaseCallbackManager] = None, output_parser: Optional[AgentOutputParser] = None, prefix: str = 'Answer the following questions as best you can. You have access to the following tools:', suffix: str = 'Begin!\n\nQuestion: {input}\nThought:{agent_scratchpad}', format_instructions: str = 'Use the following format:\n\nQuestion: the input question you must answer\nThought: you should always think about what to do\nAction: the action to take, should be one of [{tool_names}]\nAction Input: the input to the action\nObservation: the result of the action\n... (this Thought/Action/Action Input/Observation can repeat N times)\nThought: I now know the final answer\nFinal Answer: the final answer to the original input question', input_variables: Optional[List[str]] = None, **kwargs: Any) Agent [source]¶
Construct an agent from an LLM and tools.
- Parameters
llm (BaseLanguageModel) –
tools (Sequence[BaseTool]) –
callback_manager (Optional[BaseCallbackManager]) –
output_parser (Optional[AgentOutputParser]) –
prefix (str) –
suffix (str) –
format_instructions (str) –
input_variables (Optional[List[str]]) –
kwargs (Any) –
- Return type
Agent
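A minimal sketch of this (deprecated) construction path; FakeListLLM and the "Search" tool are placeholders for a real model and real tools:
.. code-block:: python

    # Sketch: building a ZeroShotAgent and wrapping it in an AgentExecutor.
    from langchain.agents import AgentExecutor, Tool
    from langchain.agents.mrkl.base import ZeroShotAgent
    from langchain_community.llms import FakeListLLM

    tools = [Tool(name="Search", func=lambda q: "no results", description="Looks things up.")]
    llm = FakeListLLM(responses=["I now know the final answer.\nFinal Answer: 42"])

    agent = ZeroShotAgent.from_llm_and_tools(llm=llm, tools=tools)
    executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
    # executor.invoke({"input": "What is the answer?"})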
- classmethod from_orm(obj: Any) Model ¶
- Parameters
obj (Any) –
- Return type
Model
- get_allowed_tools() Optional[List[str]] ¶
- Return type
Optional[List[str]]
- get_full_inputs(intermediate_steps: List[Tuple[AgentAction, str]], **kwargs: Any) Dict[str, Any] ¶
Create the full inputs for the LLMChain from the intermediate steps.
- Parameters
intermediate_steps (List[Tuple[AgentAction, str]]) –
kwargs (Any) –
- Return type
Dict[str, Any]
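A sketch of what those full inputs look like for a throwaway agent with one prior (AgentAction, observation) pair; the key names shown reflect the default scratchpad behaviour and may vary across versions:
.. code-block:: python

    # Sketch: inspecting the inputs the agent would hand to its LLMChain.
    from langchain.agents import Tool
    from langchain.agents.mrkl.base import ZeroShotAgent
    from langchain_community.llms import FakeListLLM
    from langchain_core.agents import AgentAction

    agent = ZeroShotAgent.from_llm_and_tools(
        llm=FakeListLLM(responses=["unused"]),
        tools=[Tool(name="noop", func=lambda q: "ok", description="Does nothing.")],
    )
    steps = [(AgentAction(tool="noop", tool_input="x", log="Action: noop\nAction Input: x"), "ok")]
    full_inputs = agent.get_full_inputs(steps, input="What is the answer?")
    print(sorted(full_inputs))  # typically includes 'agent_scratchpad', 'input' and 'stop'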
- json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Optional[Callable[[Any], Any]] = None, models_as_dict: bool = True, **dumps_kwargs: Any) unicode ¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
- Parameters
include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
by_alias (bool) –
skip_defaults (Optional[bool]) –
exclude_unset (bool) –
exclude_defaults (bool) –
exclude_none (bool) –
encoder (Optional[Callable[[Any], Any]]) –
models_as_dict (bool) –
dumps_kwargs (Any) –
- Return type
unicode
- classmethod parse_file(path: Union[str, Path], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model ¶
- Parameters
path (Union[str, Path]) –
content_type (unicode) –
encoding (unicode) –
proto (Protocol) –
allow_pickle (bool) –
- Return type
Model
- classmethod parse_obj(obj: Any) Model ¶
- Parameters
obj (Any) –
- Return type
Model
- classmethod parse_raw(b: Union[str, bytes], *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model ¶
- Parameters
b (Union[str, bytes]) –
content_type (unicode) –
encoding (unicode) –
proto (Protocol) –
allow_pickle (bool) –
- Return type
Model
- plan(intermediate_steps: List[Tuple[AgentAction, str]], callbacks: Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]] = None, **kwargs: Any) Union[AgentAction, AgentFinish] ¶
Given input, decide what to do.
- Parameters:
intermediate_steps: Steps the LLM has taken to date, along with the observations.
callbacks: Callbacks to run.
**kwargs: User inputs.
- Returns:
Action specifying what tool to use.
- Parameters
intermediate_steps (List[Tuple[AgentAction, str]]) –
callbacks (Optional[Union[List[BaseCallbackHandler], BaseCallbackManager]]) –
kwargs (Any) –
- Return type
Union[AgentAction, AgentFinish]
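Roughly how a direct call to plan looks; the fake LLM's scripted reply is contrived so that the agent's output parser returns an AgentFinish:
.. code-block:: python

    # Sketch: calling plan() by hand with an empty scratchpad.
    from langchain.agents import Tool
    from langchain.agents.mrkl.base import ZeroShotAgent
    from langchain_community.llms import FakeListLLM

    tools = [Tool(name="noop", func=lambda q: "ok", description="Does nothing.")]
    llm = FakeListLLM(responses=["I now know the final answer.\nFinal Answer: 42"])
    agent = ZeroShotAgent.from_llm_and_tools(llm=llm, tools=tools)

    result = agent.plan(intermediate_steps=[], input="What is the answer?")
    print(result)  # an AgentFinish whose return_values contain {'output': '42'}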
- return_stopped_response(early_stopping_method: str, intermediate_steps: List[Tuple[AgentAction, str]], **kwargs: Any) AgentFinish ¶
Return a response when the agent has been stopped due to reaching the maximum number of iterations.
- Parameters
early_stopping_method (str) –
intermediate_steps (List[Tuple[AgentAction, str]]) –
kwargs (Any) –
- Return type
AgentFinish
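A quick sketch of the "force" early-stopping path using a throwaway agent (with "force", no further LLM call is made):
.. code-block:: python

    # Sketch: the canned AgentFinish produced when the executor force-stops the agent.
    from langchain.agents import Tool
    from langchain.agents.mrkl.base import ZeroShotAgent
    from langchain_community.llms import FakeListLLM

    agent = ZeroShotAgent.from_llm_and_tools(
        llm=FakeListLLM(responses=["unused"]),
        tools=[Tool(name="noop", func=lambda q: "ok", description="Does nothing.")],
    )
    finish = agent.return_stopped_response(early_stopping_method="force", intermediate_steps=[])
    print(finish.return_values)  # a canned "stopped due to iteration/time limit" message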
- save(file_path: Union[Path, str]) None ¶
Save the agent.
- Parameters:
file_path: Path of the file to save the agent to.
Example:
.. code-block:: python

    # If working with an agent executor
    agent.agent.save(file_path="path/agent.yaml")
- Parameters
file_path (Union[Path, str]) –
- Return type
None
- classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') DictStrAny ¶
- Parameters
by_alias (bool) –
ref_template (unicode) –
- Return type
DictStrAny
- classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) unicode ¶
- Parameters
by_alias (bool) –
ref_template (unicode) –
dumps_kwargs (Any) –
- Return type
unicode
- tool_run_logging_kwargs() Dict ¶
- Return type
Dict
- classmethod update_forward_refs(**localns: Any) None ¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
- Parameters
localns (Any) –
- Return type
None
- classmethod validate(value: Any) Model ¶
- Parameters
value (Any) –
- Return type
Model
- property input_keys: List[str]¶
Return the input keys.
- :meta private:
- property llm_prefix: str¶
Prefix to prepend to the LLM call.
- property observation_prefix: str¶
Prefix to prepend to the observation.
- property return_values: List[str]¶
Return values of the agent.