anything-llm/frontend
Latest commit 90df37582b by Sean Hatfield:
Per workspace model selection ()
* WIP model selection per workspace (migrations and OpenAI model save properly)

* revert OpenAiOption

* add support for per-workspace models for anthropic, localAi, ollama, openAi, and togetherAi

* remove unneeded comments

* update logic for when LLMProvider is reset; reset AI provider files with master

* remove the frontend/api reset of workspace chat and move that logic to updateENV; add postUpdate callbacks to envs (see the second sketch after this commit message)

* set the preferred chat model on class instantiation (see the first sketch after this commit message)

* remove extra param

* linting

* remove unused var

* refactor chat model selection on workspace

* linting

* add fallback for base path to localai models

---------

Co-authored-by: timothycarambat <rambat1010@gmail.com>
Committed 2024-01-17 12:59:25 -08:00
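
To illustrate the shape of this change, here is a minimal sketch of how a per-workspace model preference can take priority over the system-wide env default when an LLM provider class is instantiated. It is a sketch under assumed names: `OpenAiProvider`, `OPEN_MODEL_PREF`, `workspace.chatModel`, and `getProviderForWorkspace` are illustrative, not necessarily the exact identifiers used in anything-llm.

```js
// Minimal sketch (assumed names): a provider that prefers the workspace's
// saved model and falls back to the env-level default.
class OpenAiProvider {
  constructor(modelPreference = null) {
    // Per-workspace selection wins; otherwise use the system-wide default.
    this.model =
      modelPreference || process.env.OPEN_MODEL_PREF || "gpt-3.5-turbo";
  }

  async chat(messages) {
    // Placeholder for the real API call; only the model routing matters here.
    return { model: this.model, messages };
  }
}

// At chat time the workspace's saved model (if any) is passed in.
function getProviderForWorkspace(workspace) {
  return new OpenAiProvider(workspace?.chatModel ?? null);
}

// Example: a workspace pinned to gpt-4 uses it; others use the env default.
const provider = getProviderForWorkspace({ chatModel: "gpt-4" });
console.log(provider.model); // "gpt-4"
```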
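
Likewise, a rough sketch of the postUpdate-callback idea mentioned above: after an env key such as the LLM provider is written, registered hooks run so side effects (for example, clearing saved per-workspace chat models) live in updateENV rather than in the frontend/api layer. `KEY_MAPPING`, `resetWorkspaceChatModels`, and the hook signature are assumptions made for illustration.

```js
// Minimal sketch (assumed names): env keys mapped to post-update hooks that
// run after a value is written, so a provider change can clear per-workspace
// model selections server-side instead of in the frontend/api layer.
async function resetWorkspaceChatModels(key, prevValue, nextValue) {
  if (prevValue === nextValue) return;
  // In the real app this would null out each workspace's saved chat model
  // (e.g. an ORM update); logged here to keep the sketch self-contained.
  console.log(`${key} changed (${prevValue} -> ${nextValue}); clearing workspace models.`);
}

const KEY_MAPPING = {
  LLMProvider: {
    envKey: "LLM_PROVIDER",
    postUpdate: [resetWorkspaceChatModels],
  },
};

async function updateENV(newValues) {
  for (const [key, value] of Object.entries(newValues)) {
    const def = KEY_MAPPING[key];
    if (!def) continue;
    const prev = process.env[def.envKey];
    process.env[def.envKey] = value;
    // Hooks run only after the env value has actually been applied.
    for (const hook of def.postUpdate ?? []) await hook(key, prev, value);
  }
}

// Example: switching providers triggers the cleanup hook once.
updateENV({ LLMProvider: "ollama" });
```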
| File / Directory | Last commit | Date |
| --- | --- | --- |
| public | Robots.txt () | 2023-11-13 15:22:24 -08:00 |
| src | Per workspace model selection () | 2024-01-17 12:59:25 -08:00 |
| .env.example | devcontainer v1 () | 2024-01-08 15:31:06 -08:00 |
| .gitignore | devcontainer v1 () | 2024-01-08 15:31:06 -08:00 |
| .nvmrc | bump node version requirement | 2023-06-08 10:29:17 -07:00 |
| index.html | add feedback form, hosting link, update readme, show promo image | 2023-08-11 17:28:30 -07:00 |
| jsconfig.json | chore: add @ as alias for frontend root () | 2023-12-07 09:09:01 -08:00 |
| package.json | Patch minor XSS opportunity where user can self-XSS themselves. () | 2024-01-11 09:57:59 -08:00 |
| postcss.config.js | initial commit | 2023-06-03 19:28:07 -07:00 |
| tailwind.config.js | devcontainer v1 () | 2024-01-08 15:31:06 -08:00 |
| vite.config.js | devcontainer v1 () | 2024-01-08 15:31:06 -08:00 |
| yarn.lock | Patch minor XSS opportunity where user can self-XSS themselves. () | 2024-01-11 09:57:59 -08:00 |