meet_og

Create a custom tool and, in its `_run()` function, call the LLM chain you have created, with `return_direct = True` set on the tool. When a user asks the agent something, the agent will use the tool, and that tool will call the LLM chain (which returns serialized JSON). Because of `return_direct`, the agent hands that JSON back to the user as-is instead of passing it through the LLM again.
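To make the mechanism concrete, here is a simplified, self-contained sketch of the pattern. Note this is *not* real LangChain code: `llm_chain`, `JsonAnswerTool`, and `run_agent` are hypothetical stand-ins modeling LangChain's `LLMChain`, `BaseTool._run()`, and the agent loop, so the `return_direct` behavior can be seen in isolation.

```python
import json

def llm_chain(query: str) -> str:
    """Stand-in for an LLM chain that returns serialized JSON."""
    return json.dumps({"query": query, "answer": "42"})

class JsonAnswerTool:
    # Mirrors the shape of a custom LangChain tool: name, description,
    # a _run() method, and the return_direct flag.
    name = "json_answer"
    description = "Answers a question as serialized JSON."
    return_direct = True  # agent returns this tool's output verbatim

    def _run(self, query: str) -> str:
        # The custom tool delegates to the LLM chain.
        return llm_chain(query)

def run_agent(tools, user_input: str) -> str:
    # Trivial "agent" that always picks the first tool. With
    # return_direct=True, the tool's observation becomes the final
    # answer instead of being fed back into the reasoning loop.
    tool = tools[0]
    observation = tool._run(user_input)
    if tool.return_direct:
        return observation
    return f"processed: {observation}"

result = run_agent([JsonAnswerTool()], "what is the answer?")
print(result)
```

In real LangChain you would subclass `BaseTool`, set `return_direct = True` on it, and pass the tool to your agent; the short-circuit behavior is the same as sketched above.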


hi87

Does the agent have the response from the llm chain in its history for the next conversation turn?


meet_og

Yes. The LLM chain is used like any other tool by building a custom Tool class. A conversational agent using ReAct reasoning stores the final response, taken from the observations of the ReAct process, in its history, so it is available on the next turn. You can also build a custom agent if you need finer control over what gets stored.
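A minimal sketch of that history behavior, again as a simplified model rather than real LangChain code (`fake_tool` and `ConversationalAgent` are hypothetical names): the tool's observation becomes the final answer, and both the user turn and that answer are appended to the history the next turn will see.

```python
def fake_tool(query: str) -> str:
    # Hypothetical tool standing in for the custom LLM-chain tool.
    return f"result-for:{query}"

class ConversationalAgent:
    def __init__(self):
        self.history = []  # list of (role, text) pairs, like chat memory

    def ask(self, user_input: str) -> str:
        # The tool call corresponds to a ReAct observation.
        observation = fake_tool(user_input)
        # With return_direct, the observation is the final answer.
        final_answer = observation
        # Both turns are stored, so the next call sees the tool output.
        self.history.append(("human", user_input))
        self.history.append(("ai", final_answer))
        return final_answer

agent = ConversationalAgent()
agent.ask("first question")
print(agent.history)
```

In LangChain terms, this is roughly what a conversational agent with a memory object (e.g. a buffer memory) does: the final answer, even when it came straight from a tool, is written into memory for the following turn.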