Mirror of https://github.com/khoj-ai/khoj.git, synced 2024-11-23 23:48:56 +01:00
Move some gifs to the assets s3 bucket and add instructions for Ollama, shareable conversations

parent e23c803cee
commit e2922968d6
9 changed files with 49 additions and 4 deletions
Binary file not shown. (before: 14 MiB)
Binary file not shown. (before: 9.6 MiB)
@@ -25,7 +25,7 @@ You can upload documents to Khoj from the web interface, one at a time. This is
 1. You can drag and drop the document into the chat window.
 2. Or click the paperclip icon in the chat window and select the document from your file system.
 
-![demo of dragging and dropping a file](https://khoj-web-bucket.s3.amazonaws.com/drag_drop_file.gif)
+![demo of dragging and dropping a file](https://assets.khoj.dev/drag_drop_file.gif)
 
 ### Install on Phone
 
 You can optionally install Khoj as a [Progressive Web App (PWA)](https://web.dev/learn/pwa/installation). This makes it quick and easy to access Khoj on your phone.
@@ -4,7 +4,7 @@ The Notion integration allows you to search/chat with your Notion workspaces. [N
 
 Go to https://app.khoj.dev/config to connect your Notion workspace(s) to Khoj.
 
-![notion_integration](/img/notion_integration.gif)
+![notion_integration](https://assets.khoj.dev/notion_integration.gif)
 
 ## Self-Hosted Setup
 
@@ -6,7 +6,7 @@ sidebar_position: 4
 
 You can use agents to set up custom system prompts with Khoj. The server host can set up their own agents, which are accessible to all users. You can see ours at https://app.khoj.dev/agents.
 
-![Demo](/img/agents_demo.gif)
+![Demo](https://assets.khoj.dev/agents_demo.gif)
 
 ## Creating an Agent (Self-Hosted)
 
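As a conceptual sketch only (not Khoj's actual implementation), an agent's custom system prompt can be thought of as a system message prepended to the conversation before it reaches the chat model:

```python
# Illustrative sketch: the function name and message shapes are assumptions,
# showing only the general idea of a custom system prompt for an agent.
def apply_agent_prompt(system_prompt: str, messages: list[dict]) -> list[dict]:
    """Prepend the agent's system prompt to an OpenAI-style message list."""
    return [{"role": "system", "content": system_prompt}] + messages

chat = [{"role": "user", "content": "Summarize my notes on testing."}]
full = apply_agent_prompt("You are a concise research assistant.", chat)
print(full[0]["role"])  # system
```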
documentation/docs/features/share.md (new file, 7 lines)
@@ -0,0 +1,7 @@
+# Shareable Chat
+
+You can share any of your conversations by going to the three dot menu on the conversation and selecting 'Share'. This will create a **public** link that you can share with anyone. The link will open the conversation in the same state it was in when you shared it, so your future messages will not be visible to the person you shared it with.
+
+This means you can easily share a conversation with someone to show them how you solved a problem, or to get help with something you're working on.
+
+![demo of sharing a conversation](https://assets.khoj.dev/shareable_conversations.gif)
@@ -38,7 +38,7 @@ Welcome to the Khoj Docs! This is the best place to get setup and explore Khoj's
 - [Read these instructions](/get-started/setup) to self-host a private instance of Khoj
 
 ## At a Glance
-![demo_chat](/img/using_khoj_for_studying.gif)
+![demo_chat](https://assets.khoj.dev/using_khoj_for_studying.gif)
 
 #### [Search](/features/search)
 - **Natural**: Use natural language queries to quickly find relevant notes and documents.
@@ -201,6 +201,11 @@ To disable HTTPS, set the `KHOJ_NO_HTTPS` environment variable to `True`. This c
 1. Go to http://localhost:42110/server/admin and login with your admin credentials.
 #### Configure Chat Model
 ##### Configure OpenAI or a custom OpenAI-compatible proxy server
+
+:::info[Ollama Integration]
+Using Ollama? See the [Ollama Integration](/miscellaneous/ollama) section for more custom setup instructions.
+:::
+
 1. Go to the [OpenAI settings](http://localhost:42110/server/admin/database/openaiprocessorconversationconfig/) in the server admin settings to add an OpenAI processor conversation config. This is where you set your API key and server API base URL. The API base URL is optional - it's only relevant if you're using another OpenAI-compatible proxy server.
 2. Go over to configure your [chat model options](http://localhost:42110/server/admin/database/chatmodeloptions/). Set the `chat-model` field to a supported chat model[^1] of your choice. For example, you can specify `gpt-4-turbo-preview` if you're using OpenAI.
 - Make sure to set the `model-type` field to `OpenAI`.
documentation/docs/miscellaneous/ollama.md (new file, 33 lines)
@@ -0,0 +1,33 @@
+# Ollama / Khoj
+
+You can run your own open source models locally with Ollama and use them with Khoj.
+
+:::info[Ollama Integration]
+This is only going to be helpful for self-hosted users. If you're using [Khoj Cloud](https://app.khoj.dev), you're limited to our first-party models.
+:::
+
+Khoj supports any OpenAI-API compatible server, which includes [Ollama](http://ollama.ai/). Ollama allows you to start a local server with [several popular open-source LLMs](https://ollama.com/library) directly on your own computer. Combined with Khoj, you can chat with these LLMs and use them to search your notes and documents.
+
+While Khoj also supports local-hosted LLMs downloaded from Hugging Face, the Ollama integration is particularly useful for its ease of setup and multi-model support, especially if you're already using Ollama.
+
+## Setup
+
+1. Set up Ollama: https://ollama.com/
+2. Start your preferred model with Ollama. For example,
+    ```bash
+    ollama run llama3
+    ```
+3. Go to Khoj settings at [OpenAI Processor Conversation Config](http://localhost:42110/server/admin/database/openaiprocessorconversationconfig/)
+4. Create a new config.
+    - Name: `ollama`
+    - Api Key: `any string`
+    - Api Base Url: `http://localhost:11434/v1/` (default for Ollama)
+5. Go to [Chat Model Options](http://localhost:42110/server/admin/database/chatmodeloptions/)
+6. Create a new config.
+    - Name: `llama3` (replace with the name of your local model)
+    - Model Type: `Openai`
+    - Openai Config: `<the ollama config you created in step 4>`
+    - Max prompt size: `1000` (replace with the max prompt size of your model)
+7. Go to [your config](http://localhost:42110/config) and select the model you just created in the chat model dropdown.
+
+That's it! You should now be able to chat with your Ollama model from Khoj. If you want to add additional models running on Ollama, repeat step 6 for each model.
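To sanity-check the config above, it helps to see the OpenAI-style request shape that an OpenAI-compatible server like Ollama accepts at that base URL. A minimal Python sketch (the helper function is illustrative, not part of Khoj; it only builds the request, so no server needs to be running):

```python
# Sketch assuming Ollama's default port (11434) and the `llama3` model
# from the steps above; adjust both to your own setup.
import json
import urllib.request

def build_chat_request(base_url: str, model: str, message: str) -> urllib.request.Request:
    """Build an OpenAI-compatible /chat/completions POST request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": message}],
    }
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode(),
        # Ollama accepts any API key, matching the `any string` config above.
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer any string"},
    )

req = build_chat_request("http://localhost:11434/v1/", "llama3", "Hello!")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
# urllib.request.urlopen(req) would return the model's reply once Ollama is running.
```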