anything-llm/server/utils/AiProviders
Sean Hatfield (commit 0634013788)
[FEAT] Groq LLM support ()
* Groq LLM support complete

* update useGetProvidersModels for groq models

* Add definitions
update comments and error log reports
add example envs

---------

Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-03-06 14:48:38 -08:00
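
The commit above mentions adding example envs for the Groq provider. A minimal sketch of what those server-side .env entries could look like, assuming the provider follows the same naming pattern as the other integrations (the variable names and model value below are assumptions, not taken from this listing):

    # Select Groq as the LLM provider (names assumed from the existing provider pattern)
    LLM_PROVIDER='groq'
    # API key issued from the Groq console
    GROQ_API_KEY=gsk-xxxxxxxxxxxxxxxx
    # Preferred chat model for completions
    GROQ_MODEL_PREF=mixtral-8x7b-32768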
anthropic add support for mistral api () 2024-01-17 14:42:05 -08:00
azureOpenAi Refactor LLM chat backend () 2024-02-14 12:32:07 -08:00
gemini Refactor LLM chat backend () 2024-02-14 12:32:07 -08:00
groq [FEAT] Groq LLM support () 2024-03-06 14:48:38 -08:00
huggingface Refactor LLM chat backend () 2024-02-14 12:32:07 -08:00
lmStudio Refactor LLM chat backend () 2024-02-14 12:32:07 -08:00
localAi Refactor LLM chat backend () 2024-02-14 12:32:07 -08:00
mistral Refactor LLM chat backend () 2024-02-14 12:32:07 -08:00
native Refactor LLM chat backend () 2024-02-14 12:32:07 -08:00
ollama [DOCS] Update Docker documentation to show how to setup Ollama with Dockerized version of AnythingLLM () 2024-02-21 18:42:32 -08:00
openAi Enable ability to do full-text query on documents () 2024-02-21 13:15:45 -08:00
openRouter [FEAT] OpenRouter integration () 2024-02-23 17:18:58 -08:00
perplexity CHORE: bump pplx model support () 2024-02-23 17:33:16 -08:00
togetherAi Refactor LLM chat backend () 2024-02-14 12:32:07 -08:00
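
Each directory above holds one provider integration that the chat backend selects based on the configured LLM provider. A rough, illustrative sketch of what such a module might look like, assuming Groq is reached through its OpenAI-compatible endpoint (the class name and method shape here are assumptions for illustration, not taken from this listing):

    // groq/index.js (hypothetical sketch of a provider module)
    const OpenAI = require("openai");

    class GroqLLM {
      constructor(modelPreference = null) {
        // Groq exposes an OpenAI-compatible API, so the OpenAI client can be reused
        this.model = modelPreference || process.env.GROQ_MODEL_PREF;
        this.client = new OpenAI({
          baseURL: "https://api.groq.com/openai/v1",
          apiKey: process.env.GROQ_API_KEY,
        });
      }

      // Non-streaming chat completion used by the chat backend
      async getChatCompletion(messages = [], { temperature = 0.7 } = {}) {
        const result = await this.client.chat.completions.create({
          model: this.model,
          messages,
          temperature,
        });
        return result.choices?.[0]?.message?.content || null;
      }
    }

    module.exports = { GroqLLM };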