Mirror of https://github.com/khoj-ai/khoj.git (synced 2024-11-23 23:48:56 +01:00)

Commit 69ef6829c1:
- Integrate with Ollama or other OpenAI-compatible APIs by simply setting the `OPENAI_API_BASE` environment variable in docker-compose etc. (see the compose sketch below)
- Update docs on integrating with Ollama and OpenAI proxies on first run
- Auto-populate all chat models supported by OpenAI-compatible APIs
- Auto-set vision enabled for all commercial models

Minor:
- Add the Hugging Face cache to the khoj_models volume. This is where chat models and (now) sentence transformer models are stored by default
- Reduce verbosity of the web app's yarn install. Otherwise it hits the Docker log size limit and stops showing the remaining logs after the web app install
- Suggest `ollama pull <model_name>` to start it in the background
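For illustration, a minimal docker-compose sketch of the `OPENAI_API_BASE` approach described above. The service name (`server`), the host URL, and the placeholder `OPENAI_API_KEY` are assumptions for this sketch rather than values taken from Khoj's shipped compose file; Ollama does expose its OpenAI-compatible API under `/v1` on its default port 11434.

```yaml
# Sketch only: the service name and values below are assumptions; adapt them to your setup.
services:
  server:
    environment:
      # Point Khoj at any OpenAI-compatible API; here, a local Ollama instance.
      - OPENAI_API_BASE=http://host.docker.internal:11434/v1
      # Many OpenAI-compatible servers ignore the key, but clients often expect one to be set.
      - OPENAI_API_KEY=placeholder
```

Before bringing the stack up, pull the chat model you want Ollama to serve, e.g. `ollama pull <model_name>`, so it is available when Khoj auto-populates the chat models offered by the API.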
Directory listing:

- _category_.json
- admin.md
- authentication.md
- litellm.md
- lmstudio.md
- ollama.mdx
- remote.md
- support-multilingual-docs.md
- use-openai-proxy.md