Updated 11 months ago

Hi everyone! I am building a small app with llama-index using Azure OpenAI models with streaming, and now I want to show default messages when a user faces a disconnection, OpenAI has downtime, and so on. Currently I am testing this by manually disconnecting my network in the middle of the request, but from my tests this can also be "simulated" by setting max_retries=0 and timeout=0 when instantiating the AzureOpenAI class. Regardless, I looked at StreamingAgentChatResponse's write_response_to_history(...) method, and it caught my eye that if an exception occurs, _is_function_not_none_thread_event is never set, which causes the whole request to wait indefinitely.
3 comments
sounds like a good fix to make in a PR? πŸ‘€
I wanted to make sure if what I am saying has some sense, I could evaluate to make a PR addressing this issue
I think it definitely makes sense. There should probably be a try/except that sets the done field
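The failure mode and the suggested try/except fix can be sketched outside llama-index like this (a minimal stand-alone sketch; the function and names below are illustrative stand-ins, not the library's actual internals):

```python
import threading

# If a writer thread raises before setting its threading.Event, every
# consumer blocking on event.wait() hangs forever. Wrapping the loop in
# try/except/finally guarantees the event fires even when streaming fails.

def write_response_to_history(event: threading.Event, chunks, errors: list):
    """Simplified stand-in for a streaming writer thread."""
    try:
        for _chunk in chunks:
            pass  # ... append chunk to chat history ...
    except Exception as exc:
        errors.append(exc)  # record the failure instead of swallowing it
    finally:
        event.set()  # always unblock waiters, even on failure

def failing_stream():
    yield "partial token"
    raise ConnectionError("simulated mid-stream disconnect")

errors: list = []
event = threading.Event()
t = threading.Thread(target=write_response_to_history,
                     args=(event, failing_stream(), errors))
t.start()
# Without the finally block, this wait would time out (or hang forever
# when called with no timeout, which matches the behavior described above).
unblocked = event.wait(timeout=5)
t.join()
print(unblocked, type(errors[0]).__name__)  # True ConnectionError
```

Recording the exception in addition to setting the event lets the caller re-raise or surface a default message to the user, rather than silently ending the stream.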