Mirror of https://github.com/khoj-ai/khoj.git, synced 2025-02-17 08:04:21 +00:00
- Use temperature of 0 by default for the extract questions offline chat actor
- Use temperature of 0.2 for send_message_to_model_offline (this is the default temperature set by llama.cpp)
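The temperature policy described above can be sketched as a small helper; the function name and task labels here are hypothetical, not taken from the khoj codebase:

```python
# Sketch of the temperature policy from this commit: 0 for deterministic
# question extraction, 0.2 (llama.cpp's default) for general offline chat.
# Names below are illustrative assumptions, not actual khoj identifiers.
DEFAULT_TEMPERATURE = 0.2        # llama.cpp default, used by send_message_to_model_offline
EXTRACT_QUESTIONS_TEMPERATURE = 0.0  # deterministic output for question extraction

def temperature_for(task: str) -> float:
    """Pick the sampling temperature for an offline chat task."""
    if task == "extract_questions":
        return EXTRACT_QUESTIONS_TEMPERATURE
    return DEFAULT_TEMPERATURE
```

A temperature of 0 makes the extraction step reproducible, while 0.2 keeps general chat responses mildly varied without straying far from the most likely tokens.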
Top-level directories:
- interface
- khoj
- telemetry