I see responses in the retrieve step of a ReAct chat engine. But when I attempt to see the sources via response.sources or response.source_nodes, I get empty lists. Any specific reason this happens, and any insights?
The react agent hasn't been updated to set the sources yet πŸ˜…
Would be an excellent PR though!
Logan, has this been implemented yet? Planning to take a look and see if I can try contributing, if this is still not implemented!
Still not implemented πŸ˜… Definitely take a look if you have the time!
Saw this PR (https://github.com/jerryjliu/llama_index/pull/6745), curious to know what was the issue with the agent changes that you had to undo them?
That's more related to the sub-question query engine. The interface and the way it was implemented are not great haha. Decided to log it through callbacks for that one.

Probably the best reference is how it's done for the openai agent
https://github.com/jerryjliu/llama_index/blob/647a9fff672b10f20c34f2b8b23a79930775c827/llama_index/agent/openai_agent.py#L224

Just adding ToolOutput objects to the overall response object essentially
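The pattern described above can be sketched in a few lines. Note this is an illustrative mimic, not llama_index's actual code: `ToolOutput` and `AgentResponse` here are simplified stand-ins for the real classes, and `run_agent` replays a fixed list of tool calls rather than a real reasoning loop.

```python
# Sketch of the fix discussed above: record a ToolOutput for every tool
# call the agent makes, then attach the accumulated list to the final
# response so response.sources is populated instead of empty.
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List, Tuple

@dataclass
class ToolOutput:
    content: str       # what the tool returned, as text
    tool_name: str     # which tool produced it
    raw_input: dict    # the arguments the tool was called with
    raw_output: Any    # the unmodified return value

@dataclass
class AgentResponse:
    response: str
    sources: List[ToolOutput] = field(default_factory=list)

def run_agent(tools: Dict[str, Callable[[str], str]],
              steps: List[Tuple[str, str]]) -> AgentResponse:
    """Execute a fixed sequence of (tool_name, tool_input) steps,
    recording each ToolOutput so it can be surfaced on the response."""
    sources: List[ToolOutput] = []
    last = ""
    for name, arg in steps:
        out = tools[name](arg)
        sources.append(ToolOutput(content=out, tool_name=name,
                                  raw_input={"input": arg}, raw_output=out))
        last = out
    # The key step: pass the accumulated sources into the response object.
    return AgentResponse(response=last, sources=sources)
```

A contribution along these lines would mean threading the same `sources` list through the ReAct agent's loop and into its response constructor, the way the linked openai_agent.py does.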
That makes sense, thank you so much for the openai agent reference! Will start there.
Had one more question @Logan M (Sorry to bother you repeatedly, feel free to ignore if you're busy!) - if I were using a custom LLM from LangChain that basically does OpenAI calls (we have a custom library wrapper around OpenAI LLMs), I wouldn't be able to use it with OpenAI Agent in Llama Index, right?
Again, appreciate you and your help so so much!
Yea, the OpenAI agent relies on the function-calling API offered by openai πŸ€”
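To illustrate the distinction: the OpenAI agent expects a structured `function_call` back from the model, while a ReAct-style agent only needs plain text it can parse, which is why a custom LLM wrapper can drive the latter but not the former. The parser below is a hypothetical sketch of that text-parsing step, not llama_index's actual implementation.

```python
# A ReAct loop works with any LLM because the "tool call" is just text in
# a known format ("Action: ..." / "Action Input: {...}") that the agent
# parses itself -- no function-calling API support required on the model side.
import json
import re

def parse_react_step(text: str) -> tuple:
    """Extract (tool_name, tool_args) from a ReAct-formatted completion."""
    action = re.search(r"Action:\s*(\S+)", text)
    action_input = re.search(r"Action Input:\s*(\{.*\})", text, re.DOTALL)
    if not (action and action_input):
        raise ValueError("completion is not a valid ReAct step")
    return action.group(1), json.loads(action_input.group(1))
```

So if the custom wrapper can produce completions in this format, a ReAct agent over the same tools is the usual workaround when the function-calling API isn't available.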