The server chat setting was previously required, but it is now only
useful for more advanced configurations, not typical for self-hosting users.
With recent updates, the user's selected chat model is used for both
Khoj's train of thought and its response. This makes it easy to
switch your preferred chat model directly from the user settings
page, without having to update it in the admin panel as well.
Reflect these code changes in the docs by removing the now unnecessary
step for self-hosted users to create a server chat setting when using
an OpenAI-compatible proxy service like Ollama, LiteLLM, etc.
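The only remaining requirement on the proxy side is that it expose an
OpenAI-compatible API. Below is a minimal sketch (not part of this
change) to sanity-check that before selecting the model in Khoj. It
assumes the `openai` Python package, Ollama's default OpenAI-compatible
endpoint at http://localhost:11434/v1, and a hypothetical locally
pulled model named `llama2`; adjust these for LiteLLM, LM Studio, etc.

```python
# Sketch: verify an OpenAI-compatible proxy responds before pointing
# Khoj's chat model configuration at it. Endpoint and model name are
# assumptions based on Ollama defaults; adjust for your proxy.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="not-needed",                  # local proxies usually ignore the key
)

response = client.chat.completions.create(
    model="llama2",  # hypothetical; use a model you have pulled locally
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```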
This is an initial pass at adding documentation for all the knobs
available on the Khoj Admin panel.
It should shed some light on what each admin setting is for and how
it can be customized when self-hosting.
Resolves #831
- Improve Self Hosting Docker Instructions
- Ask users to install Docker Desktop to avoid a separate
docker-compose install and to unify the instructions across operating systems
- To Self-Host on Windows, ask users to use Docker Desktop with the WSL2 backend
- Use nested Tab grouping to split Docker vs Pip Self Host Instructions
- Reduce Self Host Setup Steps in Documentation after code simplification
- First run no longer requires configuring Khoj via the admin panel
- So move the chat model config steps into the optional post-setup
config section
- Improve Instructions to Configure chat models on First Run
- Compress configuring chat model providers into a Tab Group
- Add Documentation for Remote Access under Advanced Self Hosting
- Update references to the settings page to use the new URL across docs
and code
- Rename desktop and web settings page to settings.html instead of
config[ure].html
- Deprecate khoj-assistant pypi package. Use the more accurate and
succinct pypi project name, khoj
- Update references in docs and code to use the khoj pypi package
instead of the legacy khoj-assistant pypi package (see the
version-check sketch after this list)
- Update pypi workflow to publish to both khoj and khoj-assistant for now
- Update the stale Python 3.9 support mentioned in our pyproject. Can't
support Python 3.9 as we depend on the latest Django, which requires >=3.10
- Add instructions for self-hosted users, with info and warning boxes, to
avoid and fix common issues when setting up the Khoj server
- Create a new Advanced Self Hosting section
  - Extract Advanced Self-Hosting sections from the Advanced page and
move them to separate pages under the Advanced Self Hosting section
- Improve OpenAI Proxy Docs
- Put Ollama setup as a section under the OpenAI API Proxy page instead
of on a separate page
- Add a section on using Khoj with a chat model from LM Studio
- Update LiteLLM docs to use a chat model from LM Studio
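As referenced in the pypi package rename item above, here is a small,
standard-library-only sketch to check which distribution is installed
in a given environment after the rename. Only the two package names
come from this change; the script itself is illustrative.

```python
# Sketch: report whether the new `khoj` package or the legacy
# `khoj-assistant` package is installed in the current environment.
from importlib import metadata

for name in ("khoj", "khoj-assistant"):
    try:
        print(f"{name}: {metadata.version(name)}")
    except metadata.PackageNotFoundError:
        print(f"{name}: not installed")
```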