Hi everyone! I am building a small app with llama-index using Azure OpenAI models with streaming, and now I want to show default messages when a user faces a disconnection, OpenAI has downtime, and so on. Currently I am testing this by manually disconnecting my network in the middle of a request, but as far as I can tell it can also be "simulated" by setting max_retries=0 and timeout=0 when instantiating the AzureOpenAI class. Regardless, I checked StreamingAgentChatResponse's write_response_to_history(...) method, and it caught my eye that if an exception occurs, the _is_function_not_none_thread_event is never set, which causes the whole request to hang indefinitely.
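For reference, this is roughly how I am forcing the failure. A minimal sketch; the endpoint, deployment name, key, and API version are placeholders for my actual config:

```python
from llama_index.llms.azure_openai import AzureOpenAI

# With no retries and a zero timeout, every request should fail
# immediately, which "simulates" a network drop or OpenAI downtime.
llm = AzureOpenAI(
    engine="my-deployment",  # placeholder deployment name
    model="gpt-4",
    azure_endpoint="https://my-resource.openai.azure.com/",  # placeholder
    api_key="...",  # placeholder
    api_version="2024-02-01",
    max_retries=0,
    timeout=0,
)
```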
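In the meantime I am working around it on my side by wrapping the stream and yielding the default message myself, something like the sketch below (safe_stream is a hypothetical helper of mine, not part of the library):

```python
def safe_stream(chat_engine, message: str,
                fallback: str = "Sorry, the service is temporarily unavailable."):
    """Yield streamed tokens, or a fallback message if the stream fails."""
    try:
        # stream_chat returns a StreamingAgentChatResponse
        response = chat_engine.stream_chat(message)
        for token in response.response_gen:
            yield token
    except Exception:
        # Network drop, timeout, etc. -- emit the default message
        # instead of waiting on the never-set thread event.
        yield fallback
```

This papers over the symptom for my app, but it doesn't fix the underlying issue that write_response_to_history(...) never sets the event on exception. Is that expected behavior, or should I open a bug report?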