
Updated 2 years ago

I was checking `index.as_chat_engine()`

I was checking `index.as_chat_engine()` and had some doubts.

What will happen if the history context gets too big? Will it drop some of the previous conversation so that OpenAI can still predict on it?

If not, I'd be happy to work on it and create a PR.
7 comments
The ReAct chat engine will use a memory module from LangChain, and they have certain ones that will work for longer chat histories.

The others may hit some issues though. A PR for this would be awesome πŸ™‚
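To make the long-history concern concrete, here is a minimal sketch of a trimming chat memory that evicts the oldest turns once a token budget is exceeded. This is a plain-Python illustration, not LangChain's or LlamaIndex's actual API; the class name, the whitespace token estimate, and the budget are all assumptions for the sketch.

```python
from collections import deque


class TrimmingChatMemory:
    """Hypothetical sketch: keep chat history under a token budget
    by dropping the oldest turns first. Not a real LangChain class."""

    def __init__(self, max_tokens=1000):
        self.max_tokens = max_tokens
        self.turns = deque()  # (role, text) pairs, oldest first
        self._count = 0

    @staticmethod
    def _tokens(text):
        # crude whitespace-based token estimate, good enough for a sketch
        return len(text.split())

    def add(self, role, text):
        self.turns.append((role, text))
        self._count += self._tokens(text)
        # evict oldest turns until the history fits the budget again;
        # always keep at least the newest turn
        while self._count > self.max_tokens and len(self.turns) > 1:
            _, old_text = self.turns.popleft()
            self._count -= self._tokens(old_text)

    def history(self):
        return list(self.turns)
```

A real implementation would use the model's tokenizer rather than whitespace splitting, and might summarize evicted turns instead of discarding them, which is what some of LangChain's longer-history memory modules do.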
Got it, Will work on this.
Also found one more case!
ChatEngine is supposed to be designed for use in QA bots. We should have a chat memory map for each user, since different users will have different contexts, and that is not yet supported in my opinion.
If we get some sort of user_id or chat_id we can map it to its own context; if none is passed we keep the current scenario.
I'll be working on these two parts
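The per-user mapping described above could be sketched like this: one memory object per `user_id`/`chat_id`, with a shared default memory when no id is passed (preserving the current single-memory behaviour). The class and method names are hypothetical, not part of LlamaIndex.

```python
class PerUserMemory:
    """Hypothetical sketch: map each user_id/chat_id to its own
    memory object; fall back to one shared memory when no id is given."""

    def __init__(self, memory_factory):
        self._factory = memory_factory      # callable that builds a fresh memory
        self._memories = {}                 # user_id -> memory
        self._default = memory_factory()    # shared memory, today's behaviour

    def get(self, user_id=None):
        if user_id is None:
            # no id passed: keep the current single-context scenario
            return self._default
        if user_id not in self._memories:
            self._memories[user_id] = self._factory()
        return self._memories[user_id]
```

Usage would be `memories = PerUserMemory(dict)` (or any memory class as the factory), then `memories.get(chat_id)` inside the chat engine before each turn.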