khoj/documentation/docs/get-started
Debanjum Singh Solanky dcdd1edde2 Update docs to show how to set up llama-cpp with Khoj
- How to pip install Khoj to run offline chat on GPU (see the sketch after this commit entry).
  After the migration to llama-cpp-python, more GPU types are supported, but they
  require a build step, so document how to do it.
- New default offline chat model.
- Where to get supported chat models from on HuggingFace.
2024-03-26 22:33:01 +05:30
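The commit body above refers to a pip install that needs a build step for GPU support. A minimal sketch of what such an install could look like, assuming the `khoj-assistant` PyPI package name and llama-cpp-python's standard `CMAKE_ARGS`/`FORCE_CMAKE` build variables; the exact flags documented in setup.mdx may differ:

```bash
# Sketch only: build llama-cpp-python with GPU acceleration while installing Khoj.
# The CMake toggles below are llama-cpp-python's own build options; verify against setup.mdx.

# NVIDIA GPU (CUDA via cuBLAS)
CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 pip install khoj-assistant

# Apple Silicon (Metal)
CMAKE_ARGS="-DLLAMA_METAL=on" FORCE_CMAKE=1 pip install khoj-assistant

# CPU only (no build flags needed)
pip install khoj-assistant
```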
| File | Last commit | Date |
| --- | --- | --- |
| _category_.json | Migrate to using docusaurus, rather than docsify for documentation (#603) | 2024-01-07 20:28:15 +05:30 |
| demos.md | Migrate to using docusaurus, rather than docsify for documentation (#603) | 2024-01-07 20:28:15 +05:30 |
| overview.md | Use Khoj Client, Data sources diagrams in feature docs | 2024-01-08 01:58:57 +05:30 |
| privacy_security.md | Include info about privacy in the docs (#631) | 2024-01-29 17:47:23 +05:30 |
| setup.mdx | Update docs to show how to set up llama-cpp with Khoj | 2024-03-26 22:33:01 +05:30 |