langchain.agents.agent_toolkits.conversational_retrieval.openai_functions.create_conversational_retrieval_agent

langchain.agents.agent_toolkits.conversational_retrieval.openai_functions.create_conversational_retrieval_agent(llm: BaseLanguageModel, tools: List[BaseTool], remember_intermediate_steps: bool = True, memory_key: str = 'chat_history', system_message: Optional[SystemMessage] = None, verbose: bool = False, max_token_limit: int = 2000, **kwargs: Any) → AgentExecutor[source]

A convenience method for creating a conversational retrieval agent.

Parameters
  • llm (BaseLanguageModel) – The language model to use; should be an instance of ChatOpenAI.

  • tools (List[BaseTool]) – A list of tools the agent has access to.

  • remember_intermediate_steps (bool) – Whether the agent should remember intermediate steps, i.e. the action/observation pairs from previous questions. Remembering them lets the agent draw on any relevant information they contain when answering follow-up questions; the downside is that they take up more tokens.

  • memory_key (str) – The name of the memory key in the prompt.

  • system_message (Optional[SystemMessage]) – The system message to use. By default, a basic one will be used.

  • verbose (bool) – Whether the final AgentExecutor should be verbose. Defaults to False.

  • max_token_limit (int) – The maximum number of tokens to keep in memory. Defaults to 2000.

  • kwargs (Any) – Additional keyword arguments passed through to the AgentExecutor.

Returns

An agent executor initialized appropriately

Return type

AgentExecutor
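
A minimal usage sketch (not part of the reference itself): the FAISS/OpenAIEmbeddings retriever setup below is illustrative and assumes an OpenAI API key and the faiss package; any retriever wrapped with create_retriever_tool would work the same way.

    from langchain.agents.agent_toolkits import (
        create_conversational_retrieval_agent,
        create_retriever_tool,
    )
    from langchain.chat_models import ChatOpenAI
    from langchain.embeddings import OpenAIEmbeddings
    from langchain.vectorstores import FAISS

    # Illustrative retriever; any BaseRetriever can back the tool.
    vectorstore = FAISS.from_texts(
        ["Employees receive a laptop during their first week."],
        OpenAIEmbeddings(),
    )
    retriever = vectorstore.as_retriever()

    tool = create_retriever_tool(
        retriever,
        "search_company_docs",
        "Searches and returns documents about company policies.",
    )

    llm = ChatOpenAI(temperature=0)

    agent_executor = create_conversational_retrieval_agent(llm, [tool], verbose=True)

    result = agent_executor({"input": "What does the policy say about laptops?"})
    print(result["output"])

Because memory_key defaults to 'chat_history', repeated calls on the same agent_executor carry the conversation forward automatically, subject to max_token_limit.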

Examples using create_conversational_retrieval_agent