Mirror of https://github.com/khoj-ai/khoj.git (synced 2024-12-17 18:17:10 +00:00)
Commit 91c76d4152

Given that the LLM landscape is rapidly changing, providing a good default set of options should help reduce decision fatigue when getting started.

Improve initialization flow during first run:
- Set Google and Anthropic chat models too. Previously only Offline and OpenAI chat models could be set during init.
- Add multiple chat models for each LLM provider. Interactively set a comma separated list of models for each provider.
- Auto add default chat models for each provider in non-interactive mode if the {OPENAI,GEMINI,ANTHROPIC}_API_KEY env var is set.
- Do not ask for max_tokens or tokenizer for offline models during initialization. Use better defaults inferred in code instead.
- Explicitly set the default chat model to use. If unset, it implicitly defaults to the first chat model; making it explicit reduces this confusion.

Resolves #882
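The non-interactive path described above hinges on detecting each provider's API key from the environment and falling back to a default model list, while the interactive path accepts a comma separated list of models per provider. The sketch below illustrates one way that logic could look. It is not Khoj's actual implementation: the function names (`detect_default_chat_models`, `parse_model_list`), the provider keys, and the default model name strings are illustrative assumptions; only the environment variable names come from the commit message.

```python
import os

# Hypothetical sketch, not Khoj's code. Default model names below are placeholders;
# the env var names (OPENAI_API_KEY, GEMINI_API_KEY, ANTHROPIC_API_KEY) match the commit message.
DEFAULT_CHAT_MODELS = {
    "openai": ("OPENAI_API_KEY", ["gpt-4o-mini", "gpt-4o"]),
    "google": ("GEMINI_API_KEY", ["gemini-1.5-flash", "gemini-1.5-pro"]),
    "anthropic": ("ANTHROPIC_API_KEY", ["claude-3-5-sonnet-20241022", "claude-3-5-haiku-20241022"]),
}


def detect_default_chat_models() -> dict[str, list[str]]:
    """Return default chat models for each provider whose API key env var is set (non-interactive mode)."""
    configured: dict[str, list[str]] = {}
    for provider, (env_var, models) in DEFAULT_CHAT_MODELS.items():
        if os.getenv(env_var):
            configured[provider] = models
    return configured


def parse_model_list(user_input: str, fallback: list[str]) -> list[str]:
    """Split an interactively entered comma separated model list, falling back to defaults when empty."""
    models = [model.strip() for model in user_input.split(",") if model.strip()]
    return models or fallback


if __name__ == "__main__":
    # Non-interactive: providers are picked up from environment variables alone.
    print(detect_default_chat_models())
    # Interactive: a comma separated answer overrides the provider defaults.
    print(parse_model_list("gpt-4o, gpt-4o-mini", ["gpt-4o-mini"]))
```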
data
__init__.py
conftest.py
helpers.py
test_cli.py
test_client.py
test_conversation_utils.py
test_date_filter.py
test_db_lock.py
test_docx_to_entries.py
test_file_filter.py
test_helpers.py
test_image_to_entries.py
test_markdown_to_entries.py
test_multiple_users.py
test_offline_chat_actors.py
test_offline_chat_director.py
test_openai_chat_actors.py
test_openai_chat_director.py
test_org_to_entries.py
test_orgnode.py
test_pdf_to_entries.py
test_plaintext_to_entries.py
test_text_search.py
test_word_filter.py