From 037e15764882a28df3e5a73a998c5814d2b734f8 Mon Sep 17 00:00:00 2001
From: sabaimran
Date: Mon, 8 Jul 2024 16:49:13 +0530
Subject: [PATCH] Fix a variety of links

---
 documentation/docs/clients/desktop.md    | 2 +-
 documentation/docs/get-started/setup.mdx | 8 +++-----
 2 files changed, 4 insertions(+), 6 deletions(-)

diff --git a/documentation/docs/clients/desktop.md b/documentation/docs/clients/desktop.md
index 3e146a9a..6c089424 100644
--- a/documentation/docs/clients/desktop.md
+++ b/documentation/docs/clients/desktop.md
@@ -15,7 +15,7 @@ Khoj will keep these files in sync to provide contextual responses when you sear
   - **Faster answers**: Find answers quickly, from your private notes or the public internet
   - **Assisted creativity**: Smoothly weave across retrieving answers and generating content
   - **Iterative discovery**: Iteratively explore and re-discover your notes
-  - **Quick access**: Use [Khoj Mini](/features/desktop_shortcut) on the desktop to quickly pull up a mini chat module for quicker answers
+  - **Quick access**: Use [Khoj Mini](/features/khoj_mini) on the desktop to quickly pull up a mini chat module for quicker answers
 - **Search**
   - **Natural**: Advanced natural language understanding using Transformer based ML Models
   - **Incremental**: Incremental search for a fast, search-as-you-type experience
diff --git a/documentation/docs/get-started/setup.mdx b/documentation/docs/get-started/setup.mdx
index a2a01907..5f739d6f 100644
--- a/documentation/docs/get-started/setup.mdx
+++ b/documentation/docs/get-started/setup.mdx
@@ -210,7 +210,7 @@ Add a `ServerChatSettings` with `Default` and `Summarizer` fields set to your pr
 ##### Configure OpenAI Chat
 
 :::info[Ollama Integration]
-Using Ollama? See the [Ollama Integration](/advanced/use-openai-proxy#ollama) section for more custom setup instructions.
+Using Ollama? See the [Ollama Integration](/advanced/ollama) section for more custom setup instructions.
 :::
 
 1. Go to the [OpenAI settings](http://localhost:42110/server/admin/database/openaiprocessorconversationconfig/) in the server admin settings to add an OpenAI processor conversation config. This is where you set your API key and server API base URL. The API base URL is optional - it's only relevant if you're using another OpenAI-compatible proxy server.
@@ -227,11 +227,9 @@ Any chat model on Huggingface in GGUF format can be used for local chat. Here's
 - The `tokenizer` and `max-prompt-size` fields are optional. You can set these for non-standard models (i.e not Mistral or Llama based models) or when you know the token limit of the model to improve context stuffing.
 
 #### Share your data
-You can sync your files and folders with Khoj using the [Desktop](/get-started/setup#2-download-the-desktop-client), Obsidian, or Emacs clients or just drag and drop specific files on the Web client Here's how you can do it:
-1. Select files and folders to index [using the desktop client]. When you click 'Save', the files will be sent to your server for indexing.
-   - Select Notion workspaces and Github repositories to index using the web interface.
+You can sync your files and folders with Khoj using the [Desktop](/clients/desktop#setup), [Obsidian](/clients/obsidian#setup), or [Emacs](/clients/emacs#setup) clients or just drag and drop specific files on the [website](/clients/web#upload-documents). You can also directly sync your [Notion workspace](/data-sources/notion_integration).
 
-[^1]: Khoj, by default, can use [OpenAI GPT3.5+ chat models](https://platform.openai.com/docs/models/overview) or [GGUF chat models](https://huggingface.co/models?library=gguf). See [this section](/miscellaneous/advanced#use-openai-compatible-llm-api-server-self-hosting) on how to locally use OpenAI-format compatible proxy servers.
+[^1]: Khoj, by default, can use [OpenAI GPT3.5+ chat models](https://platform.openai.com/docs/models/overview) or [GGUF chat models](https://huggingface.co/models?library=gguf). See [this section](/advanced/use-openai-proxy) on how to locally use OpenAI-format compatible proxy servers.
 
 ### 3. Use Khoj 🚀
 