Mirror of https://github.com/khoj-ai/khoj.git, synced 2024-11-27 17:35:07 +01:00

Fix list references in use openai proxy docs

parent: 852662f946
commit: 591c582eeb

2 changed files with 10 additions and 10 deletions
@@ -21,17 +21,17 @@ For specific integrations, see our [Ollama](/advanced/ollama), [LMStudio](/advan
 ## General Setup
 1. Start your preferred OpenAI API compatible app
-3. Create a new [OpenAI Processor Conversation Config](http://localhost:42110/server/admin/database/openaiprocessorconversationconfig/add) on your Khoj admin panel
+2. Create a new [OpenAI Processor Conversation Config](http://localhost:42110/server/admin/database/openaiprocessorconversationconfig/add) on your Khoj admin panel
-    - Name: `proxy-name`
+    - Name: `any name`
     - Api Key: `any string`
     - Api Base Url: **URL of your Openai Proxy API**
-4. Create a new [Chat Model Option](http://localhost:42110/server/admin/database/chatmodeloptions/add) on your Khoj admin panel.
+3. Create a new [Chat Model Option](http://localhost:42110/server/admin/database/chatmodeloptions/add) on your Khoj admin panel.
     - Name: `llama3` (replace with the name of your local model)
     - Model Type: `Openai`
-    - Openai Config: `<the proxy config you created in step 3>`
+    - Openai Config: `<the proxy config you created in step 2>`
     - Max prompt size: `2000` (replace with the max prompt size of your model)
     - Tokenizer: *Do not set for OpenAI, mistral, llama3 based models*
-5. Create a new [Server Chat Setting](http://localhost:42110/server/admin/database/serverchatsettings/add/) on your Khoj admin panel
+4. Create a new [Server Chat Setting](http://localhost:42110/server/admin/database/serverchatsettings/add/) on your Khoj admin panel
-    - Default model: `<name of chat model option you created in step 4>`
+    - Default model: `<name of chat model option you created in step 3>`
-    - Summarizer model: `<name of chat model option you created in step 4>`
+    - Summarizer model: `<name of chat model option you created in step 3>`
-6. Go to [your config](http://localhost:42110/settings) and select the model you just created in the chat model dropdown.
+5. Go to [your config](http://localhost:42110/settings) and select the model you just created in the chat model dropdown.
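The General Setup steps in this hunk amount to pointing Khoj at any OpenAI-compatible endpoint. As a minimal illustration of what those admin-panel fields feed into (the helper below is hypothetical, not part of Khoj; the base URL and port are placeholders), this sketch builds the kind of request such a proxy must accept: the Api Base Url plus `/chat/completions`, any string as the API key, and your local model name.

```python
import json

def build_chat_request(api_base_url, api_key, model, messages):
    """Sketch of an OpenAI-compatible chat request.

    `api_base_url` is the "Api Base Url" from step 2, `api_key` can be
    any string for most local proxies, and `model` is the name from
    step 3 (e.g. "llama3"). Illustrative helper only.
    """
    # OpenAI-compatible servers expose chat under <base>/chat/completions
    url = api_base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

# Example: a proxy serving llama3 on a local port (placeholder URL)
url, headers, body = build_chat_request(
    "http://localhost:8080/v1", "any string", "llama3",
    [{"role": "user", "content": "Hello"}],
)
```

This is why the docs say the API key can be `any string`: local proxies generally ignore the bearer token, but the OpenAI client protocol still sends one.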
@@ -260,13 +260,13 @@ Using Ollama? See the [Ollama Integration](/advanced/ollama) section for more cu
 1. Create a new [OpenAI processor conversation config](http://localhost:42110/server/admin/database/openaiprocessorconversationconfig/add) in the server admin settings. This is kind of a misnomer, we know.
     - Add your [OpenAI API key](https://platform.openai.com/api-keys)
     - Give the configuration a friendly name like `OpenAI`
-    - (Optional) Set the API base URL. It is only relevant if you're using another OpenAI-compatible proxy server like [Ollama](/advanced/ollama) or [LMStudio](/advanced/lmstudio).
+    - (Optional) Set the API base URL. It is only relevant if you're using another OpenAI-compatible proxy server like [Ollama](/advanced/ollama) or [LMStudio](/advanced/lmstudio).<br />
     ![example configuration for openai processor](/img/example_openai_processor_config.png)
 2. Create a new [chat model options](http://localhost:42110/server/admin/database/chatmodeloptions/add)
     - Set the `chat-model` field to an [OpenAI chat model](https://platform.openai.com/docs/models). Example: `gpt-4o`.
     - Make sure to set the `model-type` field to `OpenAI`.
     - If your model supports vision, set the `vision enabled` field to `true`. This is currently only supported for OpenAI models with vision capabilities.
-    - The `tokenizer` and `max-prompt-size` fields are optional. Set them only if you're sure of the tokenizer or token limit for the model you're using. Contact us if you're unsure what to do here.
+    - The `tokenizer` and `max-prompt-size` fields are optional. Set them only if you're sure of the tokenizer or token limit for the model you're using. Contact us if you're unsure what to do here.<br />
 </TabItem>
 <TabItem value="anthropic" label="Anthropic">
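The chat model options in this hunk reduce to a few form fields with simple rules: `chat-model` is required, `model-type` must be `OpenAI` for this path, `vision enabled` only applies to OpenAI models, and `max-prompt-size` is optional. As a hedged sketch (the validator itself is hypothetical; only the field names come from the docs above), those rules could be checked like this:

```python
def validate_chat_model_options(opts):
    """Hypothetical validator mirroring the admin-panel fields described above."""
    errors = []
    if not opts.get("chat-model"):
        errors.append("chat-model is required, e.g. 'gpt-4o'")
    # The docs note the field value is the literal string "OpenAI"
    if opts.get("model-type") != "OpenAI":
        errors.append("model-type must be 'OpenAI' for this setup")
    # tokenizer / max-prompt-size are optional; only sanity-check if set
    max_size = opts.get("max-prompt-size")
    if max_size is not None and (not isinstance(max_size, int) or max_size <= 0):
        errors.append("max-prompt-size must be a positive integer if set")
    return errors

ok = validate_chat_model_options(
    {"chat-model": "gpt-4o", "model-type": "OpenAI", "vision enabled": True}
)
bad = validate_chat_model_options({"chat-model": "gpt-4o", "model-type": "openai"})
```

Here `ok` comes back empty, while the lowercase `openai` in `bad` is flagged, which is the kind of casing mistake the admin form quietly rejects.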