khoj/documentation/docs
Debanjum · 69ef6829c1 · Simplify integrating Ollama, OpenAI proxies with Khoj on first run
- Integrate with Ollama or other OpenAI-compatible APIs by simply
  setting the `OPENAI_API_BASE` environment variable in docker-compose
  etc. (see the sketch below)
- Update docs on integrating with Ollama and OpenAI proxies on first run
- Auto-populate all chat models supported by OpenAI-compatible APIs
- Auto-enable vision for all commercial models

- Minor
  - Add the Hugging Face cache to the khoj_models volume. This is where
    chat models and (now) sentence transformer models are stored by default
  - Reduce verbosity of the web app's yarn install. Otherwise it hits the
    Docker log size limit and stops showing the remaining logs after the
    web app install
  - Suggest running `ollama pull <model_name>` to start it in the background
2024-11-17 02:08:20 -08:00
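The OpenAI-proxy integration described above is configuration-driven, so a short illustration may help. The sketch below is not Khoj's actual implementation; it only shows how a client pointed at an `OPENAI_API_BASE` such as Ollama's OpenAI-compatible endpoint (typically `http://localhost:11434/v1`) can discover the chat models the server offers, which is the kind of auto-population this commit describes. The default URL and the placeholder API key are assumptions.

```python
import os

from openai import OpenAI  # official openai>=1.x Python client

# Point an OpenAI-style client at whatever OPENAI_API_BASE is configured.
# Ollama's OpenAI-compatible endpoint usually lives at /v1 on port 11434,
# and Ollama ignores the API key, so any placeholder value works.
client = OpenAI(
    base_url=os.environ.get("OPENAI_API_BASE", "http://localhost:11434/v1"),
    api_key=os.environ.get("OPENAI_API_KEY", "not-needed"),
)

# List every model the OpenAI-compatible server advertises; a setup like
# the one in this commit could register each as a selectable chat model.
for model in client.models.list():
    print(model.id)
```

In a docker-compose setup this would translate to setting `OPENAI_API_BASE` in the Khoj container's environment and running `ollama pull <model_name>` beforehand so the model is already available locally.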
advanced · Simplify integrating Ollama, OpenAI proxies with Khoj on first run · 2024-11-17 02:08:20 -08:00
clients · Create explicit flow to enable the free trial (#944) · 2024-10-23 15:29:23 -07:00
contributing · Add link to self-hosted admin page and add docs for building front-end assets. Close #901 · 2024-10-22 22:42:27 -07:00
data-sources · Update the documentation with swanky new demo videos · 2024-09-11 19:57:10 -07:00
features · Add documentation for how to use the text to image model configs, reduce to Replicate · 2024-11-15 15:26:14 -08:00
get-started · Simplify integrating Ollama, OpenAI proxies with Khoj on first run · 2024-11-17 02:08:20 -08:00
miscellaneous · Allow disabling Khoj telemetry via KHOJ_TELEMETRY_DISABLE env var · 2024-11-11 19:17:39 -08:00