Mirror of https://github.com/khoj-ai/khoj.git (synced 2024-11-23 15:38:55 +01:00)

Topics: agent, ai, assistant, chat, chatgpt, emacs, image-generation, llama3, llamacpp, llm, obsidian, obsidian-md, offline-llm, productivity, rag, research, self-hosted, semantic-search, stt, whatsapp-ai
Latest commit: 1c920273dd
## Overview

Use git to capture prompt traces of Khoj's train of thought. View, analyze and debug them using your favorite git client (e.g. VS Code, magit).

- Each commit captures one interaction with an LLM. The commit writes the query, response and system message each to a separate file in the repo. The commit message records the chat model, Khoj version and other metadata.
- Each conversation turn can have multiple interactions with an LLM (e.g. Khoj's train of thought).
- Each new conversation turn forks from and merges back into its conversation branch.
- Each new conversation branches from the user branch.
- Each new user branches from the root commit on the main branch.

## Usage

1. Set `KHOJ_DEBUG=true` or start Khoj in very verbose mode with `khoj -vv` to turn on prompt tracing.
2. Chat with Khoj as usual.
3. Open the prompt trace git repo to view the generated prompt traces using your favorite git porcelain. The prompt trace repo is created at `/tmp/khoj_promptrace` by default; configure the directory by setting the `PROMPTRACE_DIR` environment variable.

## Implementation

- Add utility functions to capture prompt traces using git (via `gitpython`); see the sketch below.
- Make each model provider in Khoj commit its LLM interactions with promptrace.
- Weave chat metadata from the chat API through all chat actors and commit it to the prompt trace.
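To make the first implementation bullet concrete, here is a minimal sketch of how one interaction could be committed with `gitpython`. The file names (`query.md`, `response.md`, `system_prompt.md`), the helper name `commit_llm_interaction`, and the commit message layout are illustrative assumptions, not Khoj's actual promptrace code.

```python
# A rough sketch (not Khoj's actual code): commit one LLM interaction
# to the prompt trace repo using gitpython. File names, the helper name
# and the commit message layout are illustrative assumptions.
import os
from pathlib import Path

from git import InvalidGitRepositoryError, NoSuchPathError, Repo  # pip install gitpython


def commit_llm_interaction(query: str, response: str, system_message: str, metadata: dict) -> str:
    """Record one LLM interaction as a git commit and return its hash."""
    repo_dir = Path(os.getenv("PROMPTRACE_DIR", "/tmp/khoj_promptrace"))
    repo_dir.mkdir(parents=True, exist_ok=True)

    # Open the prompt trace repo, initializing it on first use.
    try:
        repo = Repo(repo_dir)
    except (InvalidGitRepositoryError, NoSuchPathError):
        repo = Repo.init(repo_dir)

    # Write query, response and system message each to a separate file.
    files = {"query.md": query, "response.md": response, "system_prompt.md": system_message}
    for name, content in files.items():
        (repo_dir / name).write_text(content)

    # Stage the files and record chat model, Khoj version and other metadata in the commit message.
    repo.index.add(list(files))
    trailer = "\n".join(f"{key}: {value}" for key, value in metadata.items())
    commit = repo.index.commit(f"Prompt trace: {query[:50]}\n\n{trailer}")
    return commit.hexsha
```

In the actual feature, such commits would land on per-user and per-conversation branches that fork from and merge back into their parent branch, as described above, rather than on a single branch.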
Your AI second brain
Khoj is a personal AI app to extend your capabilities. It smoothly scales up from an on-device personal AI to a cloud-scale enterprise AI.
- Chat with any local or online LLM (e.g. llama3, qwen, gemma, mistral, gpt, claude, gemini).
- Get answers from the internet and your docs (including image, pdf, markdown, org-mode, word, notion files).
- Access it from your Browser, Obsidian, Emacs, Desktop, Phone or WhatsApp.
- Create agents with custom knowledge, persona, chat model and tools to take on any role.
- Automate away repetitive research. Get personal newsletters and smart notifications delivered to your inbox.
- Find relevant docs quickly and easily using our advanced semantic search.
- Generate images, talk out loud, play your messages.
- Khoj is open-source, self-hostable. Always.
- Run it privately on your computer or try it on our cloud app.
## See it in action
Go to https://app.khoj.dev to see Khoj live.
## Full feature list
You can see the full feature list here.
## Self-Host
To get started with self-hosting Khoj, read the docs.
## Contributors
Cheers to our awesome contributors! 🎉
Made with contrib.rocks.
### Interested in Contributing?
We are always looking for contributors to help us build new features, improve the project documentation, or fix bugs. If you're interested, please see our Contributing Guidelines and check out our Contributors Project Board.