Commit graph

16 commits

Author SHA1 Message Date
Sean Hatfield
55fc9cd6b1
TogetherAI Llama 3.2 vision models support ()
* togetherai llama 3.2 vision models support

* remove console log

* fix listing to reflect what is on the chart

---------

Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-11-21 10:42:42 -08:00
Sean Hatfield
e29f054706
Bump TogetherAI models ()
* bump together ai models

* Run post-bump command

---------

Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-11-18 13:08:26 -08:00
Timothy Carambat
99f2c25b1c
Agent Context window + context window refactor. ()
* Enable agent context windows to be accurate per provider:model

* Refactor model mapping to external file
Add token count to document length instead of char-count
reference promptWindowLimit from AIProvider in central location

* remove unused imports
2024-08-15 12:13:28 -07:00
timothycarambat
466bf7dc9c Bump Perplexity and Together AI static model list 2024-07-31 10:58:34 -07:00
Timothy Carambat
0b845fbb1c
Deprecate .isSafe moderation ()
Add type defs to helpers
2024-06-28 15:32:30 -07:00
Timothy Carambat
01cf2fed17
Make native embedder the fallback for all LLMs () 2024-05-16 17:25:05 -07:00
Sean Hatfield
9feaad79cc
[CHORE] Remove sendChat and streamChat in all LLM providers ()
* remove sendChat and streamChat functions/references in all LLM providers

* remove unused imports

---------

Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-05-01 16:52:28 -07:00
Timothy Carambat
547d4859ef
Bump openai package to latest ()
* Bump `openai` package to latest
Tested all except localai

* bump LocalAI support with latest image

* add deprecation notice

* linting
2024-04-30 12:33:42 -07:00
timothycarambat
e28c0469f4 bump togetherai models Apr 18, 2024
resolves 
2024-04-18 16:28:43 -07:00
Timothy Carambat
8306098b08
Bump all static model providers () 2024-04-14 12:55:21 -07:00
Timothy Carambat
0e46a11cb6
Stop generation button during stream-response ()
* Stop generation button during stream-response

* add custom stop icon

* add stop to thread chats
2024-03-12 15:21:27 -07:00
Timothy Carambat
c59ab9da0a
Refactor LLM chat backend ()
* refactor stream/chat/embed-stream to be a single execution logic path so that it is easier to maintain and build upon

* no thread in sync chat since only api uses it
adjust import locations
2024-02-14 12:32:07 -08:00
Timothy Carambat
aca5940650
Refactor handleStream to LLM Classes () 2024-02-07 08:15:14 -08:00
Sean Hatfield
c2c8fe9756
add support for mistral api ()
* add support for mistral api

* update docs to show support for Mistral

* add default temp to all providers, suggest different results per provider

---------

Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-01-17 14:42:05 -08:00
Sean Hatfield
90df37582b
Per workspace model selection ()
* WIP model selection per workspace (migrations and openai saves properly)

* revert OpenAiOption

* add support for models per workspace for anthropic, localAi, ollama, openAi, and togetherAi

* remove unneeded comments

* update logic for when LLMProvider is reset, reset Ai provider files with master

* remove frontend/api reset of workspace chat and move logic to updateENV
add postUpdate callbacks to envs

* set preferred model for chat on class instantiation

* remove extra param

* linting

* remove unused var

* refactor chat model selection on workspace

* linting

* add fallback for base path to localai models

---------

Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-01-17 12:59:25 -08:00
Sean Hatfield
1d39b8a2ce
add Together AI LLM support ()
* add Together AI LLM support

* update readme to support together ai

* Patch togetherAI implementation

* add model sorting/option labels by organization for model selection

* linting + add data handling for TogetherAI

* change truthy statement
patch validLLMSelection method

---------

Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-01-10 12:35:30 -08:00