Is there any way to add a human feedback loop, so the chat engine can understand which of its answers were good and which were not so good, and be more careful in future responses? We collect good/bad feedback through a like and dislike button on each response. @Logan M @WhiteFang_Jr
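A minimal sketch of the feedback side, assuming the like/dislike handler can call a plain Python function; the file path, `record_feedback` name, and JSONL format are hypothetical choices for logging ratings so they can later be reviewed or used for prompt tuning / fine-tuning:

```python
import json
from datetime import datetime, timezone

FEEDBACK_PATH = "feedback_log.jsonl"  # hypothetical log location

def record_feedback(user_id: str, question: str, answer: str, liked: bool) -> None:
    """Append one like/dislike rating to a JSONL log for later analysis."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,
        "question": question,
        "answer": answer,
        "liked": liked,
    }
    with open(FEEDBACK_PATH, "a") as f:
        f.write(json.dumps(entry) + "\n")

# Example: called from the like/dislike button handler
record_feedback("user-123", "What is the refund policy?", "Refunds are ...", liked=False)
```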
I have created a chatbot using the LlamaIndex chat engine. Now I want to recognize a returning user and show them the history of the conversations they had before.
Currently I get a new conversation ID whenever a new connection is made, even from the same user. I want to remember the user. Can I identify them by IP address or some other way, and show them all the conversations they had before? @Logan M
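A minimal sketch of persisting conversations per user, keyed on whatever stable identifier the app can supply (a login, a cookie, or an IP address). It assumes recent `llama_index.core` import paths (older releases use `llama_index.memory` / `llama_index.storage.chat_store`), and the file path and `get_memory` helper are hypothetical:

```python
import os

from llama_index.core.memory import ChatMemoryBuffer
from llama_index.core.storage.chat_store import SimpleChatStore

PERSIST_PATH = "chat_store.json"  # hypothetical location for saved conversations

# Load previously saved conversations if any exist, otherwise start fresh
if os.path.exists(PERSIST_PATH):
    chat_store = SimpleChatStore.from_persist_path(PERSIST_PATH)
else:
    chat_store = SimpleChatStore()

def get_memory(user_id: str) -> ChatMemoryBuffer:
    """Memory buffer keyed to a stable user identifier (login, cookie, etc.)."""
    return ChatMemoryBuffer.from_defaults(
        token_limit=3000,
        chat_store=chat_store,
        chat_store_key=user_id,
    )

# memory = get_memory("user-123")
# chat_engine = index.as_chat_engine(chat_mode="context", memory=memory)
# ... after each turn (or on shutdown), save everything:
# chat_store.persist(persist_path=PERSIST_PATH)
# ... on reconnect, show the user their earlier conversation:
# previous_messages = chat_store.get_messages("user-123")
```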
@Logan M can I check the available context size, the memory used, and the number of tokens consumed at runtime, so that when usage approaches the limit I can reset the variables and the chat engine before it hits the limit and breaks with an error?
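A minimal sketch of runtime token tracking with `TokenCountingHandler`, assuming recent `llama_index.core` imports and an OpenAI model tokenized via `tiktoken`; the 16,000-token context window, the 90% threshold, and the `check_and_reset` helper are hypothetical placeholders:

```python
import tiktoken
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager, TokenCountingHandler

# Count tokens for every LLM call the chat engine makes
token_counter = TokenCountingHandler(
    tokenizer=tiktoken.encoding_for_model("gpt-3.5-turbo").encode
)
Settings.callback_manager = CallbackManager([token_counter])

CONTEXT_WINDOW = 16_000  # hypothetical limit for the model in use

def check_and_reset(chat_engine) -> None:
    """Reset the conversation memory before token usage gets close to the limit."""
    if not token_counter.llm_token_counts:
        return
    # prompt tokens of the most recent call ~= size of the context just sent to the LLM
    last_prompt_tokens = token_counter.llm_token_counts[-1].prompt_token_count
    print(f"last prompt: {last_prompt_tokens} tokens, "
          f"running total: {token_counter.total_llm_token_count} tokens")
    if last_prompt_tokens > 0.9 * CONTEXT_WINDOW:
        chat_engine.reset()        # clears the engine's chat memory
        token_counter.reset_counts()
```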
I have added a custom prompt to the chat engine, but it doesn't seem to work. I have named the bot Bluebird, yet it still refers to itself as an AI language model. @Logan M
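A minimal sketch of passing a persona via `system_prompt`, assuming recent `llama_index.core` imports and a chat mode (such as `"context"`) that actually injects the system prompt; the `./data` folder and the exact prompt wording are hypothetical:

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.memory import ChatMemoryBuffer

index = VectorStoreIndex.from_documents(
    SimpleDirectoryReader("./data").load_data()  # hypothetical data folder
)

# "context" (and "condense_plus_context") chat modes pass system_prompt to every
# LLM call; other modes may not use it, which can leave the model falling back
# to generic "AI language model" answers.
chat_engine = index.as_chat_engine(
    chat_mode="context",
    memory=ChatMemoryBuffer.from_defaults(token_limit=3000),
    system_prompt=(
        "You are Bluebird, an assistant that answers questions about the indexed "
        "documents. Always refer to yourself as Bluebird, never as an AI language model."
    ),
)

print(chat_engine.chat("Who are you?"))
```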
I am currently building a chatbot that reads data from different PDFs and uses a chat engine to answer queries. The chat engine keeps the conversation context, but when multiple people use the bot at the same time, the context is shared in the same memory.
I want every session to have its own memory. Currently, if user 1 asks about x and user 2 asks about y, then when user 1 asks "tell me more", the bot elaborates on y instead of x, because the last context it has is about y. It is not taking each user's context into account individually. Can I please get some help on this? @Logan M
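A minimal sketch of per-session isolation: one chat engine (and therefore one memory buffer) per session ID, rather than a single shared engine. It assumes recent `llama_index.core` imports; the `./pdfs` folder, session ID strings, and `get_chat_engine` helper are hypothetical:

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.chat_engine.types import BaseChatEngine
from llama_index.core.memory import ChatMemoryBuffer

index = VectorStoreIndex.from_documents(
    SimpleDirectoryReader("./pdfs").load_data()  # hypothetical folder of PDFs
)

# One chat engine per session, instead of a single shared engine whose memory
# mixes every user's conversation together.
chat_engines: dict[str, BaseChatEngine] = {}

def get_chat_engine(session_id: str) -> BaseChatEngine:
    """Create or reuse a chat engine whose memory is private to this session."""
    if session_id not in chat_engines:
        chat_engines[session_id] = index.as_chat_engine(
            chat_mode="context",
            memory=ChatMemoryBuffer.from_defaults(token_limit=3000),
        )
    return chat_engines[session_id]

# Each user now keeps an independent context:
print(get_chat_engine("user-1").chat("Tell me about x"))
print(get_chat_engine("user-2").chat("Tell me about y"))
print(get_chat_engine("user-1").chat("Tell me more"))  # follows up on x, not y
```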