Merge pull request #196 from debanjum/create-chat-modal-for-obsidian
- Set your OpenAI API key in the Khoj Obsidian Settings
- Use a Modal in Obsidian for Chat
- Style the Chat Modal by combining the Khoj web interface and Obsidian theme styles
- Give the input field more space; it was too narrow previously
- References should be indexed from 1 instead of 0
- Use Obsidian font size variables to scale fonts in chat appropriately
- Add message sender and date metadata as a message footer
- Use CSS directly from the Khoj Chat web interface
- Modify it to work inside an Obsidian modal
- Replace the html, body styling from the web interface with styling applied to
a new "khoj-chat" class attached to the modal's contentEl
Merge pull request #193 from debanjum/simplify-khoj-server-setup-on-emacs
## Major Changes
- ae535a0 Configure Khoj chat using khoj.el by setting OpenAI API key in Emacs
- 82eb4bf Setup Khoj server on opening khoj.el
- 99d19dc Start Khoj server from Emacs using khoj.el
- c92d791 Install Khoj server from Emacs using khoj.el
*This assumes you have python (<3.11) and pip installed on your system path*
### Sample Config
- Enable Khoj Chat by configuring your OpenAI API key
- Specify Org files and directories to index for Search (and Chat)
By default, your org-agenda-files (including archive files) are indexed
- Invoke Khoj with `C-c s`
``` emacs-lisp
(use-package khoj
:after org
:straight (khoj
:type git
:host github
:repo "debanjum/khoj"
:files ("src/interface/emacs/khoj.el"))
:bind ("C-c s" . 'khoj)
:config (setq
khoj-openai-api-key "<YOUR_OPENAI_API_KEY_FOR_KHOJ_CHAT>"
khoj-org-directories '("~/docs/notes" "~/docs/journals")
khoj-org-files '("~/docs/tasks.org" "~/docs/journal.org" "~/docs/archive.org")))
```
Converts paths to glob style regexes that will index all org files
recursively under the specified list of paths.
This should ease setup for org-roam users of khoj.el
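For illustration only, here is a minimal sketch of the kind of directory-to-glob conversion described above; the helper name and exact glob format are assumptions, not khoj.el's actual code.

``` emacs-lisp
;; Hypothetical sketch: convert a configured directory into a recursive glob
;; matching all org files under it. Name and glob format are assumptions.
(defun my/khoj-org-directory-to-glob (dir)
  "Convert DIR into a glob matching all org files recursively under it."
  (format "%s/**/*.org" (directory-file-name (expand-file-name dir))))

(my/khoj-org-directory-to-glob "~/docs/notes")
;; => e.g. "/home/user/docs/notes/**/*.org"
```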
- khoj-auto-setup controls whether khoj.el automatically checks for and
sets up the Khoj server from within Emacs
- Extract the install, start, configure sequence into a public, interactive
method. This allows calling khoj-setup during package load via init.el (see the sketch after this list)
- Fix: Do not attempt to configure or wait for the server to be ready if
the user has declined the auto-setup request
- Fix logic to mark server started vs ready
- Previously the started/running vs ready variable definitions were getting
intertwined
- Server started indicates server bootup has been triggered
- Server ready indicates the server API is ready to accept requests
- If the Khoj server was started outside Emacs, khoj--server-ready should be set
to true by the khoj--server-running method (instead of waiting for a process message)
- If the Khoj server is unconfigured, the /config/types endpoint won't
return anything. Using config/data/default allows checking whether the Khoj server
is running without also requiring it to be configured
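As a rough sketch of the setup flow above: khoj-auto-setup and khoj-setup are named in the notes, but whether khoj-setup takes arguments, the server URL, port, endpoint path, and the my/ prefixed helper are all assumptions for illustration, not khoj.el's actual internals.

``` emacs-lisp
;; Opt out of the automatic check-and-setup prompt and run setup explicitly
;; from init.el instead (accepted values of khoj-auto-setup are assumed here).
(setq khoj-auto-setup nil)
(khoj-setup)

;; Illustrative readiness probe against an endpoint that responds even when
;; the server is unconfigured. Server URL, port and helper name are assumptions.
(require 'url)
(defvar my/khoj-server-url "http://localhost:8000")

(defun my/khoj--server-ready-p ()
  "Return non-nil if the Khoj server API responds, configured or not."
  (condition-case nil
      (and (url-retrieve-synchronously
            (concat my/khoj-server-url "/api/config/data/default") t t 2)
           t)
    (error nil)))
```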
If the config hasn't changed, there will be no update. If the config has
changed, indexing is triggered asynchronously, but the user cannot run
queries until indexing is done.
This makes it easier to know when the server is ready to be configured.
- Use a process filter and sentinel to mark whether the Khoj server is ready (sketched below)
- Display server messages for visibility into the server boot-up process
- Wait until the server is ready before opening the Khoj transient menu in Emacs
Until then Khoj features wouldn't work anyway, so this avoids confusion
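A rough sketch of the filter/sentinel approach, assuming the server is launched via a `khoj` command on the PATH and that a recognizable log line signals readiness; the command, readiness string, and my/ prefixed names are illustrative assumptions rather than khoj.el's actual implementation.

``` emacs-lisp
;; Illustrative sketch: watch the server process output to flip a ready flag
;; and surface its messages, and mark it not-ready when the process dies.
(defvar my/khoj--server-process nil)
(defvar my/khoj--server-ready nil)

(defun my/khoj--start-server ()
  "Start the Khoj server and track its readiness via a filter and sentinel."
  (setq my/khoj--server-ready nil
        my/khoj--server-process
        (make-process
         :name "khoj-server"
         :buffer "*khoj-server*"
         :command '("khoj")
         ;; Filter: echo server messages for visibility and mark the server
         ;; ready once its log suggests the API is accepting requests.
         :filter (lambda (_proc msg)
                   (message "khoj.el: %s" msg)
                   (when (string-match-p "startup complete" msg)
                     (setq my/khoj--server-ready t)))
         ;; Sentinel: clear the ready flag when the server process exits.
         :sentinel (lambda (proc _event)
                     (when (memq (process-status proc) '(exit signal))
                       (setq my/khoj--server-ready nil))))))
```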
- Move completion and chat_completion into helper methods under utils.py
- Add retry with exponential backoff on OpenAI exceptions using the
tenacity package. This is officially suggested and used by other
popular GPT-based libraries
Merge pull request #192 from debanjum/improvements-to-khoj-chat-in-emacs
### Khoj Chat on Emacs Improvements
- d78454d Load Khoj Chat buffer before asking for query to provide context
- 93e2aff Use org footnotes to add references, allows jump to def on click
- 5e9558d Stylize reference links as superscripts and show definition on hover
- bc71c19 Use `m` or `C-x m` in-buffer keybindings to send messages to Khoj
### Khoj Chat Server Improvements
- 27217a3 Time chat API sub-components for performance analysis
- 508b217 Update Chat API, Logs, Interfaces to store, use references as list
- d4b3866 Truncate message logs to below max supported prompt size by chat model
- cf28f10 Register separate timestamps for user query and response by Khoj Chat
- Use tiktoken to count tokens for chat models
- Make the number of conversation turns added to the prompt configurable via an
argument to the generate_chatml_messages_with_context method
- Remove the need to split on a magic string in the Emacs and chat interfaces
- Move compiling references into a context string for GPT into the GPT layer
- Update test setup to use the new style of setting references
- Rename the first argument of converse to the more appropriate "references"
- Render references as superscripts
- Show reference definitions on hover over reference links for easy access
- Truncate the reference definition shown on hover to 70 characters
- Add a continuation suffix, ..., when the reference definition is truncated (sketched below)
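To make the hover behaviour concrete, a hypothetical sketch follows; the helper name and the display/help-echo properties are assumptions (the Emacs interface actually goes through org footnotes, per the notes above), while the 70 character limit and "..." suffix come directly from the notes.

``` emacs-lisp
;; Hypothetical sketch: render a numbered reference as raised (superscript-like)
;; text whose tooltip is the definition, truncated to 70 characters with a "..."
;; continuation suffix. Helper name and text properties are assumptions.
(defun my/khoj--render-reference (index definition)
  "Render reference INDEX so DEFINITION shows, truncated, on hover."
  (let ((truncated (if (> (length definition) 70)
                       (concat (substring definition 0 70) "...")
                     definition)))
    (propertize (format "[%d]" index)
                'display '(raise 0.3)
                'help-echo truncated)))
```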