Mirror of https://github.com/khoj-ai/khoj.git, synced 2024-11-23 15:38:55 +01:00
Properly close chat stream iterator even if response generation fails
Previously, the chat stream iterator wasn't closed when response streaming for the offline chat model threw an exception, which required restarting the application. Now the application doesn't hang even if the current response generation fails with an exception.
This commit is contained in:
parent
bdb81260ac
commit
5927ca8032
1 changed file with 1 addition and 1 deletion
```diff
@@ -224,7 +224,7 @@ def llm_thread(g, messages: List[ChatMessage], model: Any, max_prompt_size: int
             g.send(response["choices"][0]["delta"].get("content", ""))
     finally:
         state.chat_lock.release()
-    g.close()
+        g.close()


 def send_message_to_model_offline(
```
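The fix moves `g.close()` into the `finally` block, so the consumer's iteration over the stream terminates even when response generation raises. A minimal sketch of the pattern, assuming a queue-backed generator object standing in for the `g` in the diff (the `ThreadedGenerator` class, sentinel, and simulated failure below are illustrative, not khoj's actual implementation):

```python
import queue
import threading

SENTINEL = object()  # marks end-of-stream for the consumer

class ThreadedGenerator:
    """Illustrative stand-in for `g`: a thread-safe channel with
    generator-style send()/close()."""
    def __init__(self):
        self.q = queue.Queue()

    def send(self, item):
        self.q.put(item)

    def close(self):
        # Unblocks the consumer by enqueuing the end-of-stream marker.
        self.q.put(SENTINEL)

    def __iter__(self):
        while (item := self.q.get()) is not SENTINEL:
            yield item

def llm_thread(g, chunks):
    try:
        for chunk in chunks:
            if chunk is None:
                raise RuntimeError("simulated response generation failure")
            g.send(chunk)
    except RuntimeError:
        pass  # swallowed here for a quiet demo; real code would log it
    finally:
        # The fix: close inside `finally`, so the consumer's loop ends
        # even when the exception above fires. With close() outside the
        # try/finally, a failure would leave the consumer blocked forever.
        g.close()

g = ThreadedGenerator()
t = threading.Thread(target=llm_thread, args=(g, ["Hello", " world", None]))
t.start()
received = "".join(g)  # completes instead of hanging on the failed stream
t.join()
```

With `g.close()` outside the `finally`, the exception would propagate past it and the sentinel would never be enqueued, leaving `"".join(g)` blocked on the queue indefinitely.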