Commit graph

3812 commits

Author SHA1 Message Date
sabaimran
57b4f844b7 Fail app start if initialization fails 2024-09-30 17:30:06 -07:00
Debanjum Singh Solanky
04aef362e2 Default to using system clock to infer user timezone on js clients
Using the system clock to infer the user's timezone on clients makes Khoj
more robust at providing location-aware responses.

Previously, only IP-based location was used to infer the timezone via the API.
This didn't provide a decent fallback when calls to ipapi failed or
Khoj was run in offline mode.
2024-09-30 07:08:12 -07:00
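The server-side half of this change can be sketched as follows: accept an IANA timezone string sent by the client (a JS client would typically obtain it via `Intl.DateTimeFormat().resolvedOptions().timeZone`) and fall back to UTC when it is missing or invalid. The helper name is hypothetical, not taken from Khoj's source.

```python
from typing import Optional
from zoneinfo import ZoneInfo, ZoneInfoNotFoundError


def resolve_timezone(client_tz: Optional[str]) -> ZoneInfo:
    """Resolve a client-sent IANA timezone string, falling back to UTC.

    Hypothetical helper; Khoj's actual handling may differ.
    """
    if not client_tz:
        return ZoneInfo("UTC")
    try:
        return ZoneInfo(client_tz)
    except ZoneInfoNotFoundError:
        # Unknown timezone string from the client: degrade gracefully
        return ZoneInfo("UTC")
```

This keeps location-aware responses working even when the IP-based lookup is unavailable, since the client's system clock is always present.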
Debanjum Singh Solanky
344f3c60ba Infer country from timezone when only tz received by chat API
Timezone is easier to infer using the client's system clock. This can be
used to infer the user's country name and country code, even if IP-based
location cannot be inferred.

This makes using location data to contextualize Khoj's responses more
robust. For example, online search results are retrieved for the user's
country, even if the call to ipapi.co for IP-based location fails.
2024-09-30 07:08:11 -07:00
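The timezone-to-country inference described above can be sketched with a lookup table. The table below is a tiny illustrative subset; a real implementation would use a full mapping (e.g. the `country_timezones` data shipped with pytz), and the function name is hypothetical.

```python
from typing import Optional, Tuple

# Illustrative subset of an IANA timezone -> (country name, country code) map.
TZ_TO_COUNTRY = {
    "America/Los_Angeles": ("United States", "US"),
    "Europe/Berlin": ("Germany", "DE"),
    "Asia/Kolkata": ("India", "IN"),
}


def country_from_timezone(tz: str) -> Tuple[Optional[str], Optional[str]]:
    """Infer (country_name, country_code) from a timezone, if known."""
    return TZ_TO_COUNTRY.get(tz, (None, None))
```

Because the timezone comes from the client's system clock, this works even when the ipapi.co lookup fails.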
Debanjum Singh Solanky
1fed842fcc Localize online search results to user country when location available
Send the country code from the IP-based location check on clients to the server chat API.
Use the country code to get country-specific online search results via the Serper.dev API.
2024-09-30 07:08:11 -07:00
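A minimal sketch of localizing the Serper.dev search request with the user's country code. The `gl` field mirrors Google's country parameter and is accepted by Serper's `/search` endpoint; the helper name and key placeholder are illustrative, not from Khoj's source.

```python
from typing import Optional


def build_serper_request(query: str, country_code: Optional[str] = None):
    """Build a Serper.dev search request, localized when location is known."""
    payload = {"q": query}
    if country_code:
        # Localize results to the user's country when available
        payload["gl"] = country_code.lower()
    headers = {"X-API-KEY": "<SERPER_API_KEY>", "Content-Type": "application/json"}
    return "https://google.serper.dev/search", payload, headers


# The request would then be sent with e.g. requests.post(url, json=payload, headers=headers)
url, payload, headers = build_serper_request("news today", country_code="US")
```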
Debanjum Singh Solanky
eb86f6fc42 Add __str__ func to LocationData class to dedupe location string gen
Previously the location string was generated from the location data
wherever it was used.

Adding a __str__ representation to the LocationData class lets us
dedupe and simplify the code that builds the location string.
2024-09-30 07:08:11 -07:00
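The pattern can be sketched as below. The field names (`city`, `region`, `country`) are assumed for illustration, not copied from Khoj's source; the point is that callers use `str(location)` instead of rebuilding the string themselves.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class LocationData:
    """Sketch of a location container with a canonical string form."""

    city: Optional[str] = None
    region: Optional[str] = None
    country: Optional[str] = None

    def __str__(self) -> str:
        # Join whichever parts are known, so every caller gets the same format
        return ", ".join(part for part in (self.city, self.region, self.country) if part)


location = LocationData(city="Berlin", country="Germany")
# str(location) -> "Berlin, Germany"
```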
sabaimran
1dfc89e79f Store conversation ID for new conversations as a string, not UUID 2024-09-29 18:07:08 -07:00
sabaimran
d92a349292 Improve image generation tool description 2024-09-29 16:20:25 -07:00
Debanjum Singh Solanky
d21a4e73a0 Update docs to use new variables to sync files, directories from khoj.el 2024-09-29 14:03:06 -07:00
Debanjum Singh Solanky
d66a0ccfaa Update client setup docs with instructions for self-hosting users
Resolves #808
2024-09-29 13:58:02 -07:00
Debanjum Singh Solanky
dd44933515 Release Khoj version 1.24.0 2024-09-29 04:56:11 -07:00
Debanjum Singh Solanky
1e8ce52d98 Reduce size of Khoj Docker images by removing layers and caches
- Align Dockerfile and prod.Dockerfile code
- Reduce Docker image size by 25% by reducing Docker layers and
  removing package caches
2024-09-29 04:06:35 -07:00
Debanjum Singh Solanky
9b10b3e7a1 Remove unused langchain openai server dependency 2024-09-29 04:06:35 -07:00
Debanjum Singh Solanky
e767b6eba3 Update Documentation with flags to enable GPU on Khoj pip install
- Use tabs for the GPU/CPU type Khoj is being installed on
- Update the CMAKE flags used to install Khoj with the correct GPU support
  Previous flags used DLLAMA; this has been updated to DGGML
  in llama.cpp
2024-09-29 04:06:35 -07:00
sabaimran
63a2b5b3c4 Remove tools cache in dockerize.yml workflow 2024-09-29 00:27:37 -07:00
Debanjum Singh Solanky
936bc64b82 Render images to take full width of chat message div
Remove the unnecessary "Inferred Query" heading prefix from the image generation prompt
used by Khoj. The inferred query in the chat message has a heading of its
own, so avoid two headings for the image prompt.
2024-09-28 23:45:56 -07:00
Debanjum Singh Solanky
4efa7d4464 Upgrade the Next.js web app package dependency 2024-09-28 23:45:56 -07:00
Debanjum Singh Solanky
b3cb417796 Fix spelling of Manage Context in Side Panel of Web App 2024-09-28 23:45:56 -07:00
sabaimran
676ff5fa69 Fix setting title on new conversations, add the action menu 2024-09-28 23:43:27 -07:00
Shantanu Sakpal
65d5e03f7f
Reduce tooltip popup delay duration for Create Agent button on Web app (#926)
The problem was that the tooltip was visible on hover but slow to appear, so the user would click the button before the tooltip popped up, which stopped the tooltip from appearing at all.

So I reduced the popup delay to 10ms. Now, as soon as the user hovers over the button, they will see that it's a feature coming soon!
2024-09-28 23:01:40 -07:00
Shantanu Sakpal
be8de1a1bd
Only Auto Scroll when at Page Bottom and Add Button to Scroll to Page Bottom on Web App (#923)
Improve Scrolling on Chat page of Web app

- Details
  1. Only auto scroll Khoj's streamed response when scroll is near bottom of page
      Allows scrolling to other messages in conversation while Khoj is formulating and streaming its response
  2. Add button to scroll to bottom of the chat page
  3. Scroll to most recent conversation turn on conversation first load
      It's a better default to anchor to the most recent conversation turn (i.e. the most recent user message)
  4. Smooth scroll when Khoj's chat response is streamed
      Previously the scroll would jitter during response streaming
  5. Anchor scroll position when fetching and rendering older messages in conversation
      Allows users to keep their scroll position when older messages are fetched from the server and rendered

Resolves #758
2024-09-28 22:54:34 -07:00
sabaimran
06777e1660
Convert the default conversation id to a uuid, plus other fixes (#918)
* Update the conversation_id primary key field to be a uuid

- update associated API endpoints
- this is to improve the overall application health, by obfuscating some information about the internal database
- conversation_id type is now implicitly a string, rather than an int
- ensure automations are also migrated in place, such that the conversation_ids they're pointing to are now mapped to the new IDs

* Update client-side API calls to correctly query with a string field

* Allow modifying of conversation properties from the chat title

* Improve drag and drop file experience for chat input area

* Use a phosphor icon for the copy to clipboard experience for code snippets

* Update conversation_id parameter to be a str type

* If django_apscheduler is not in the environment, skip the migration script

* Fix create automation flow by storing conversation id as string

The new UUID used for the conversation id can't be directly serialized.
Convert it to a string so it can be serialized for later execution

---------

Co-authored-by: Debanjum Singh Solanky <debanjum@gmail.com>
2024-09-24 14:12:50 -07:00
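The serialization issue fixed in the last bullet above can be reproduced in isolation: Python's `json` module raises `TypeError` on a raw `UUID`, so the id must be converted to a string before it is stored for later execution.

```python
import json
import uuid

conversation_id = uuid.uuid4()

# json.dumps raises TypeError on a raw UUID object...
try:
    json.dumps({"conversation_id": conversation_id})
    uuid_serializable = True
except TypeError:
    uuid_serializable = False

# ...so the id is converted to a string before serialization
payload = json.dumps({"conversation_id": str(conversation_id)})
```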
Debanjum Singh Solanky
0c936cecc0 Release Khoj version 1.23.3 2024-09-24 12:44:09 -07:00
Debanjum Singh Solanky
61c6e742d5 Truncate chat context to max tokens for offline, openai chat actors too 2024-09-24 12:42:32 -07:00
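Context truncation like the commit above describes can be sketched as dropping the oldest messages until the conversation fits the budget. This is a hypothetical sketch: Khoj's actual truncation uses the model's own tokenizer, for which a whitespace word count stands in here.

```python
from typing import Callable, List


def truncate_messages(
    messages: List[str],
    max_tokens: int,
    count_tokens: Callable[[str], int] = lambda s: len(s.split()),
) -> List[str]:
    """Drop oldest messages until the conversation fits within max_tokens."""
    truncated = list(messages)
    while truncated and sum(count_tokens(m) for m in truncated) > max_tokens:
        truncated.pop(0)  # drop the oldest message first
    return truncated
```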
sabaimran
e306e6ca94 Fix file paths used for pypi wheel building 2024-09-22 12:42:08 -07:00
Debanjum
f00e0e6080
Improve Khoj First Run, Docker Setup and Documentation (#919)
## Improve
- Intelligently initialize a decent default set of chat model options
- Create non-interactive mode. Auto set default server configuration on first run via Docker

## Fix
- Make RapidOCR dependency optional, as its flaky requirements were causing Docker build failures
- Set default openai text to image model correctly during initialization

## Details
Improve initialization flow during first run to remove need to configure Khoj:

- Set Google, Anthropic Chat models too
  Previously only Offline and OpenAI chat models could be set during init

- Add multiple chat models for each LLM provider
  Interactively set a comma separated list of models for each provider

- Auto add default chat models for each provider in non-interactive
  mode if the `{OPENAI,GEMINI,ANTHROPIC}_API_KEY` env var is set
  - Used when the server is run via Docker, as user input cannot be processed to configure the server during first run

- Do not ask for `max_tokens`, `tokenizer` for offline models during
  initialization. Use better defaults inferred in code instead

- Explicitly set default chat model to use
  If unset, it implicitly defaults to using the first chat model.
  Make it explicit to reduce this confusion

Resolves #882
2024-09-21 14:15:45 -07:00
Debanjum Singh Solanky
a6c0b43539 Upgrade documentation package dependencies 2024-09-21 14:06:40 -07:00
Debanjum Singh Solanky
2033f5168e Modularize chat models initialization with a reusable function
The interactive chat model initialization flow is fairly similar across
the chat model providers.

This should simplify adding new chat model providers and reduce
chances of bugs in the interactive chat model initialization flow.
2024-09-21 14:06:40 -07:00
Debanjum Singh Solanky
26c39576df Add Documentation for the settings on the Khoj Admin Panel
This is an initial pass to add documentation for all the knobs
available on the Khoj Admin panel.

It should shed some light on what each admin setting is for and how
it can be customized when self-hosting.

Resolves #831
2024-09-21 14:06:40 -07:00
Debanjum Singh Solanky
730e5608bb Improve Self Hosting Docs. Better Docker, Remote Access Setup Instructions
- Improve Self Hosting Docker Instructions
  - Ask to install Docker Desktop to avoid a separate
    docker-compose install and unify the instructions across OSes
  - To self-host on Windows, ask to use Docker Desktop with the WSL2 backend

- Use nested Tab grouping to split Docker vs Pip Self Host Instructions

- Reduce Self Host Setup Steps in Documentation after code simplification
  - First run now avoids need to configure Khoj via admin panel
  - So move the chat model config steps into optional post setup
    config section

- Improve Instructions to Configure chat models on First Run

- Compress configuring chat model providers into a Tab Group

- Add Documentation for Remote Access under Advanced Self Hosting
2024-09-21 14:06:17 -07:00
Debanjum Singh Solanky
91c76d4152 Intelligently initialize a decent default set of chat model options
Given the LLM landscape is rapidly changing, providing a good default
set of options should help reduce decision fatigue to get started

Improve initialization flow during first run
- Set Google, Anthropic Chat models too
  Previously only Offline and OpenAI chat models could be set during init

- Add multiple chat models for each LLM provider
  Interactively set a comma separated list of models for each provider

- Auto add default chat models for each provider in non-interactive
  mode if the {OPENAI,GEMINI,ANTHROPIC}_API_KEY env var is set

- Do not ask for max_tokens, tokenizer for offline models during
  initialization. Use better defaults inferred in code instead

- Explicitly set default chat model to use
  If unset, it implicitly defaults to using the first chat model.
  Make it explicit to reduce this confusion

Resolves #882
2024-09-19 20:32:08 -07:00
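The non-interactive env-var behavior above can be sketched as follows. The provider names and model lists here are illustrative placeholders, not Khoj's actual defaults, and the helper name is hypothetical.

```python
import os

# Illustrative provider/model defaults keyed by the API key env var.
PROVIDER_DEFAULTS = {
    "OPENAI_API_KEY": ("openai", ["gpt-4o-mini"]),
    "GEMINI_API_KEY": ("google", ["gemini-1.5-flash"]),
    "ANTHROPIC_API_KEY": ("anthropic", ["claude-3-5-sonnet"]),
}


def default_chat_models(env=None):
    """Return provider -> default model list for each provider whose API key is set."""
    env = os.environ if env is None else env
    return {
        provider: models
        for env_var, (provider, models) in PROVIDER_DEFAULTS.items()
        if env.get(env_var)
    }
```

Running this at first start lets a Docker deployment come up with usable chat models without any interactive configuration.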
Debanjum Singh Solanky
f177723711 Add default server configuration on first run in non-interactive mode
This should configure Khoj with decent default configurations via
Docker and avoid needing to configure Khoj via admin page to start
using dockerized Khoj

Update the default max prompt size set during Khoj initialization,
as online chat models are cheaper and offline chat models have larger
context windows now.
2024-09-19 15:12:55 -07:00
Debanjum Singh Solanky
020167c7cf Set default openai text to image model correctly during initialization
The speech-to-text model was previously being set to the text-to-image
model!
2024-09-19 15:11:34 -07:00
Debanjum Singh Solanky
077b88bafa Make RapidOCR dependency optional as flaky requirements
RapidOCR depends on OpenCV, which by default requires a bunch of GUI
system packages (like libgl1). This system package dependency set is flaky.

Making the RapidOCR dependency optional should allow Khoj to be more
resilient to setup/dependency failures.

Trade-off is that OCR for documents may not always be available and
it'll require looking at server logs to find out when this happens
2024-09-19 15:10:31 -07:00
sabaimran
0a568244fd Revert "Convert conversationId int to string before making api request to bulk update file filters"
This reverts commit c9665fb20b.

Revert "Fix handling for new conversation in agents page"

This reverts commit 3466f04992.

Revert "Add a unique_id field for identifiying conversations (#914)"

This reverts commit ece2ec2d90.
2024-09-18 20:36:57 -07:00
Debanjum Singh Solanky
bb2bd77a64 Send chat message to Khoj web app via url query param
- This allows triggering Khoj chat from the browser address bar
- So now if you add Khoj as a browser bookmark with
  - URL: https://app.khoj.dev/?q=%s
  - Keyword: khoj

- Then you can type "khoj what is the news today" to trigger Khoj to
  quickly respond to your query. This avoids having to open the Khoj web
  app before asking your question
2024-09-17 21:50:47 -07:00
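On the server or client side, extracting the bookmarked query boils down to reading the `q` query parameter from the URL. A minimal sketch with a hypothetical helper name:

```python
from typing import Optional
from urllib.parse import parse_qs, urlparse


def chat_query_from_url(url: str) -> Optional[str]:
    """Extract the chat message passed via the `q` query param,
    as in the https://app.khoj.dev/?q=%s bookmark keyword flow."""
    params = parse_qs(urlparse(url).query)
    return params.get("q", [None])[0]
```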
Debanjum Singh Solanky
ecdbcd815e Simplify code to remove json codeblock from AI response string 2024-09-17 21:50:47 -07:00
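Stripping a JSON codeblock from an AI response, as in the commit above, can be sketched with a single regex. This is an illustrative implementation; Khoj's actual helper may differ.

```python
import re


def strip_json_codeblock(response: str) -> str:
    """Remove a Markdown ```json fence wrapping an AI response, if present."""
    match = re.search(r"```(?:json)?\s*(.*?)\s*```", response, re.DOTALL)
    return match.group(1) if match else response.strip()
```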
sabaimran
e457720e8a Improve the email templates and better align with new branding 2024-09-17 11:18:25 -07:00
sabaimran
c9665fb20b Convert conversationId int to string before making api request to bulk update file filters 2024-09-16 15:45:23 -07:00
sabaimran
3466f04992 Fix handling for new conversation in agents page 2024-09-16 15:04:49 -07:00
sabaimran
ece2ec2d90
Add a unique_id field for identifying conversations (#914)
* Add a unique_id field to the conversation object

- This helps us keep track of the unique identity of the conversation without exposing the internal ID
- Create three staged migrations: first add the field, then backfill unique values, and then set the unique constraint. Without this, it tries to initialize all the existing conversations with the same ID.

* Parse and utilize the unique_id field in the query parameters of the front-end view

- Handle the unique_id field when creating a new conversation from the home page
- Parse the id field with a lightweight parameter called v in the chat page
- Share page should not be affected, as it uses the public slug

* Fix suggested card category
2024-09-16 12:19:16 -07:00
sabaimran
e6bc7a2ba2 Fix links to log in email templates 2024-09-15 19:14:19 -07:00
Debanjum Singh Solanky
79980feb7b Release Khoj version 1.23.2 2024-09-15 03:07:26 -07:00
Debanjum Singh Solanky
575ff103cf Frame chat response error on web app in a more conversational form
Also indicate that hitting dislike on the message should be enough to
convey the issue to the developers.
2024-09-15 03:00:49 -07:00
Debanjum Singh Solanky
893ae60a6a Improve handling of harmful categorized responses by Gemini
Previously Khoj would stop in the middle of response generation when
the safety filters got triggered at default thresholds. This was
confusing as it felt like a service error, not expected behavior.

Going forward Khoj will
- Only block responding to high confidence harmful content detected by
  Gemini's safety filters instead of using the default safety settings
- Show an explanatory, conversational response (w/ harm category)
  when response is terminated due to Gemini's safety filters
2024-09-15 02:17:54 -07:00
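The relaxed thresholds described above can be sketched as a safety settings map. The category and threshold names follow the Gemini API's enums (`HarmCategory` / `HarmBlockThreshold`) but are written as plain strings here, and the response helper is hypothetical, not Khoj's actual wording.

```python
# Block only high-confidence harmful content instead of Gemini's stricter defaults.
SAFETY_SETTINGS = {
    "HARM_CATEGORY_HARASSMENT": "BLOCK_ONLY_HIGH",
    "HARM_CATEGORY_HATE_SPEECH": "BLOCK_ONLY_HIGH",
    "HARM_CATEGORY_SEXUALLY_EXPLICIT": "BLOCK_ONLY_HIGH",
    "HARM_CATEGORY_DANGEROUS_CONTENT": "BLOCK_ONLY_HIGH",
}


def explain_block(harm_category: str) -> str:
    """Hypothetical helper producing a conversational message when a
    response is terminated by the safety filter."""
    label = harm_category.removeprefix("HARM_CATEGORY_").replace("_", " ").lower()
    return f"I had to stop responding because my safety filter flagged this as potential {label}."
```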
sabaimran
ec1f87a896 Release Khoj version 1.23.1 2024-09-12 22:46:39 -07:00
sabaimran
2a4416d223 Use prefetch_related for the openai_config when retrieving all chatmodeloptions async 2024-09-12 22:45:43 -07:00
sabaimran
253ca92203 Release Khoj version 1.23.0 2024-09-12 20:25:29 -07:00
Debanjum Singh Solanky
178b78f87b Show debug log, not warning, when using the default tokenizer for context stuffing 2024-09-12 20:21:01 -07:00
Debanjum
f173188dcf
Support using image generation models like Flux via Replicate (#909)
- Support using image generation models like Flux via Replicate
- Modularize the image generation code
- Make the "generate better image prompt" chat actor add composition details
- Generate vivid images with DALLE-3
2024-09-12 20:19:46 -07:00
Debanjum Singh Solanky
75d3b34452 Extract image generation code into new image processor for modularity 2024-09-12 20:01:32 -07:00