Commit graph

17 commits

Author SHA1 Message Date
sabaimran
408f4780ce Add and update documentation for setting up Khoj with an OpenAI proxy server or offline LLM 2024-04-27 20:16:32 +05:30
Debanjum Singh Solanky
e9f608174b Fix access to Khoj admin panel from non-HTTPS custom domains
To access the Khoj admin panel from a non-HTTPS custom domain, the
`KHOJ_NO_SSL` and `KHOJ_DOMAIN` env vars need to be explicitly set.

See the updated setup docs for details.

Resolves #662
2024-04-17 03:20:05 +05:30
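For context, a minimal sketch of the environment setup that commit's docs describe; the variable names come from the commit message, but the values shown are placeholders for your own deployment:

    export KHOJ_NO_SSL=true                # placeholder: allow admin panel access over plain HTTP
    export KHOJ_DOMAIN=khoj.example.com    # placeholder: the custom domain (or IP) serving Khoj
    # start the Khoj server afterwards so it picks up these variables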
Debanjum Singh Solanky
689202e00e Update recommended CMAKE flag to enable using CUDA on Linux in docs 2024-04-14 02:35:27 +05:30
Debanjum Singh Solanky
c6487f2e48 Fix docs showing how to set up llama-cpp with Khoj 2024-03-31 15:36:40 +05:30
Debanjum Singh Solanky
886d49e3a4 Merge branch 'master' into migrate-to-llama-cpp-for-offline-chat 2024-03-31 00:59:20 +05:30
sabaimran
dd2a3f712b Add more demo videos, images, add feature sections 2024-03-30 14:48:46 +05:30
sabaimran
4cb91a042e Add an agents feature page, and clarification around custom domains 2024-03-30 14:20:46 +05:30
Debanjum Singh Solanky
dcdd1edde2 Update docs to show how to set up llama-cpp with Khoj
- How to pip install khoj to run offline chat on GPU
  After the migration to llama-cpp-python, more GPU types are supported
  but require a build step, so mention how (see the sketch after this entry)
- New default offline chat model
- Where to get supported chat models from on HuggingFace
2024-03-26 22:33:01 +05:30
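A hedged sketch of the kind of GPU build-step install that commit's docs cover; the exact CMAKE flag and package name are assumptions here and depend on your llama-cpp-python version, GPU vendor, and the current Khoj package name, so check the linked docs:

    # build llama-cpp-python with GPU (here: CUDA) support, then install Khoj
    CMAKE_ARGS="-DLLAMA_CUDA=on" pip install llama-cpp-python   # flag name varies across llama-cpp-python versions
    pip install khoj-assistant                                  # PyPI package name around the time of these commits; may have changed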
Debanjum Singh Solanky
7211eb9cf5 Default to gpt-4-turbo-preview for chat model, extract questions actor
GPT-4 is more expensive and generally less capable than gpt-4-turbo-preview
2024-03-14 01:22:33 +05:30
Debanjum Singh Solanky
57dce91c91 Document using venv to handle dependency conflict on khoj pip install
Resolves #276
2024-02-23 02:07:08 +05:30
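A minimal sketch of the venv-based install that commit documents; the venv path is arbitrary and the package name is the PyPI name assumed for the time of these commits:

    python3 -m venv ~/.venvs/khoj        # isolate Khoj's dependencies from other installed packages
    source ~/.venvs/khoj/bin/activate
    pip install khoj-assistant           # adjust the package name if it has since been renamed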
Debanjum Singh Solanky
c40f642afa Move "Use OpenAI Compatible LLM Server" section to existing advanced page
Add footnote on supported chat models to the self-hosting section
2024-02-04 16:16:55 +05:30
Debanjum Singh Solanky
523af5b3aa Fix docs. Chat model options need to be set if using OpenAI proxy server 2024-02-04 06:42:05 +05:30
Debanjum Singh Solanky
474afa5efe Document using OpenAI-compatible LLM API server for Khoj chat
This allows using open or commercial, local or hosted LLM models that
are not supported in Khoj by default.

It also lets users use other local LLM API servers that support
their GPU.

Closes #407
2024-02-02 10:31:27 +05:30
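To illustrate what "OpenAI-compatible" means in that commit: any server exposing the standard chat completions endpoint can be used; the URL and model name below are placeholders:

    # query a locally hosted OpenAI-compatible server the same way you would query OpenAI
    curl http://localhost:8080/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{"model": "local-model", "messages": [{"role": "user", "content": "Hello"}]}'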
Debanjum Singh Solanky
e05474e7e0 Say when the max-prompt and tokenizer fields need setup in self-host docs 2024-01-31 08:42:22 +05:30
sabaimran
9ad44f0e77 Include info about privacy in the docs (#631)
* Add a page about privacy and organize some of the documentation
* Add notice about telemetry
* Improve copy for privacy section, link to telemetry section
2024-01-29 17:47:23 +05:30
Debanjum Singh Solanky
d920e4d0a7 Make the docs overview page the main docs landing page
- Make the docs overview page available at the docs.khoj.dev root instead of
  under the docs.khoj.dev/docs path
  - Remove the new landing page, as it is unnecessary.
- Remove /docs path prefix from links to internal doc pages
- Remove .md path suffix in internal doc pages for consistency
2024-01-08 01:13:42 +05:30
sabaimran
9b991eb4fe Migrate to using docusaurus rather than docsify for documentation (#603)
* Add docusaurus documentation (to replace the docsify setup)
* Remove older docs
* Specify documentation as the gh pages build action working directory
2024-01-07 20:28:15 +05:30
Renamed from docs/setup.md