khoj/documentation/docs/advanced
Debanjum 69ef6829c1 Simplify integrating Ollama, OpenAI proxies with Khoj on first run
- Integrate with Ollama or other OpenAI-compatible APIs by simply
  setting the `OPENAI_API_BASE` environment variable in docker-compose etc.
- Update docs on integrating with Ollama and OpenAI proxies on first run
- Auto-populate all chat models supported by OpenAI-compatible APIs
- Auto-enable vision for all commercial models

- Minor
  - Add the Hugging Face cache to the khoj_models volume. This is where
    chat models and (now) sentence transformer models are stored by default
  - Reduce verbosity of the web app's yarn install. Otherwise it hits the
    Docker log size limit and stops showing the remaining logs after the
    web app install
  - Suggest `ollama pull <model_name>` to start it in the background
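The suggested `ollama pull <model_name>` step could look like the following; the model name shown is illustrative, not one mandated by Khoj — substitute whichever chat model you configured:

```shell
# Download a chat model so Ollama can serve it. "llama3.1" is an
# illustrative model name, not taken from Khoj's docs.
ollama pull llama3.1

# List locally available models to confirm the download succeeded.
ollama list
```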
2024-11-17 02:08:20 -08:00
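The `OPENAI_API_BASE` setting described above might look like this docker-compose excerpt. The service name and host URL are assumptions for illustration (Ollama's OpenAI-compatible endpoint defaults to port 11434), not copied from Khoj's actual compose file:

```yaml
# Hypothetical docker-compose.yml excerpt: point Khoj at a local Ollama
# instance through Ollama's OpenAI-compatible /v1 endpoint.
services:
  server:                     # service name is an assumption
    environment:
      - OPENAI_API_BASE=http://host.docker.internal:11434/v1
```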
_category_.json Add Advanced Self Hosting Section, Improve Self Hosting, OpenAI Proxy Docs 2024-06-24 16:12:20 +05:30
admin.md Add link to self-hosted admin page and add docs for building front-end assets. Close #901 2024-10-22 22:42:27 -07:00
authentication.md Improve Self Hosting Docs. Better Docker, Remote Access Setup Instructions 2024-09-21 14:06:17 -07:00
litellm.md Remove need to set server chat settings from use openai proxies docs 2024-11-05 17:10:53 -08:00
lmstudio.md Remove need to set server chat settings from use openai proxies docs 2024-11-05 17:10:53 -08:00
ollama.mdx Simplify integrating Ollama, OpenAI proxies with Khoj on first run 2024-11-17 02:08:20 -08:00
remote.md Improve Self Hosting Docs. Better Docker, Remote Access Setup Instructions 2024-09-21 14:06:17 -07:00
support-multilingual-docs.md Add Advanced Self Hosting Section, Improve Self Hosting, OpenAI Proxy Docs 2024-06-24 16:12:20 +05:30
use-openai-proxy.md Remove need to set server chat settings from use openai proxies docs 2024-11-05 17:10:53 -08:00