---
sidebar_position: 4
---

# Credits

Many open source projects are used to power Khoj. Here are a few of them: