* feat: add new model provider PPIO
* fix: PPIO model fetching
* fix: code lint
* reorder LLM
update interface for streaming and chats to use valid keys (see the provider sketch below)
linting
---------
Co-authored-by: timothycarambat <rambat1010@gmail.com>
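A rough sketch of the provider shape the "interface for streaming and chats" entry above refers to: one blocking chat call and one streaming call behind the same message type. The names here (ChatMessage, LLMProvider, PPIOProvider) and the PPIO base URL are illustrative assumptions, not the project's actual code; the only thing assumed about PPIO is an OpenAI-compatible chat/completions endpoint.

```typescript
// Hypothetical provider contract: one non-streaming and one streaming entry
// point, both keyed off the same message shape.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface LLMProvider {
  chat(messages: ChatMessage[]): Promise<string>;
  streamChat(messages: ChatMessage[]): AsyncIterable<string>;
}

// Sketch of a provider backed by an assumed OpenAI-compatible PPIO endpoint.
class PPIOProvider implements LLMProvider {
  constructor(
    private apiKey: string,
    private model: string,
    private baseUrl = "https://api.ppinfra.com/v3/openai" // assumed base URL
  ) {}

  private headers() {
    return {
      Authorization: `Bearer ${this.apiKey}`,
      "Content-Type": "application/json",
    };
  }

  async chat(messages: ChatMessage[]): Promise<string> {
    const res = await fetch(`${this.baseUrl}/chat/completions`, {
      method: "POST",
      headers: this.headers(),
      body: JSON.stringify({ model: this.model, messages, stream: false }),
    });
    if (!res.ok) throw new Error(`PPIO request failed: ${res.status}`);
    const data = await res.json();
    // Non-streaming responses carry their text under message.content.
    return data.choices?.[0]?.message?.content ?? "";
  }

  async *streamChat(messages: ChatMessage[]): AsyncIterable<string> {
    const res = await fetch(`${this.baseUrl}/chat/completions`, {
      method: "POST",
      headers: this.headers(),
      body: JSON.stringify({ model: this.model, messages, stream: true }),
    });
    if (!res.ok || !res.body) throw new Error(`PPIO stream failed: ${res.status}`);

    // Parse the server-sent event stream and yield only the text deltas.
    const reader = res.body.getReader();
    const decoder = new TextDecoder();
    let buffer = "";
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      buffer += decoder.decode(value, { stream: true });
      const lines = buffer.split("\n");
      buffer = lines.pop() ?? "";
      for (const line of lines) {
        const trimmed = line.trim();
        if (!trimmed.startsWith("data:")) continue;
        const payload = trimmed.slice(5).trim();
        if (payload === "[DONE]") return;
        // Streaming chunks carry their text under delta.content instead.
        const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
        if (delta) yield delta;
      }
    }
  }
}
```

One plausible reading of the "valid keys" fix is exactly this split: streaming chunks put text under choices[0].delta.content while non-streaming responses use choices[0].message.content, so the two code paths must read different keys.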
* feat: add new model provider: Novita AI
* feat: finish Novita AI integration
* fix: code lint
* remove unneeded logging
* add back log for novita stream not self closing
* Clarify ENV vars for LLM/embedder separation for the future (sketch below)
Patch ENV check for workspace/agent provider
---------
Co-authored-by: Jason <ggbbddjm@gmail.com>
Co-authored-by: shatfield4 <seanhatfield5@gmail.com>
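A small sketch of the LLM/embedder ENV separation and the provider ENV check noted above. The variable names below are hypothetical; the point is only that the chat LLM and the embedder each read their own credentials, and that the workspace/agent check fails fast when the selected provider has nothing configured.

```typescript
// Hypothetical ENV layout: the LLM and the embedder each have their own key and
// base path, so switching one never silently reconfigures the other.
interface ProviderCredentials {
  apiKey: string;
  basePath: string;
}

function llmCredentials(): ProviderCredentials {
  return {
    apiKey: process.env.NOVITA_LLM_API_KEY ?? "", // hypothetical variable name
    basePath: process.env.NOVITA_LLM_BASE_PATH ?? "", // hypothetical variable name
  };
}

function embedderCredentials(): ProviderCredentials {
  return {
    apiKey: process.env.EMBEDDING_API_KEY ?? "", // hypothetical variable name
    basePath: process.env.EMBEDDING_BASE_PATH ?? "",
  };
}

// ENV check in the spirit of the workspace/agent provider patch: error out at
// provider selection time rather than failing mid-request.
function assertProviderConfigured(role: "llm" | "embedder"): void {
  const creds = role === "llm" ? llmCredentials() : embedderCredentials();
  if (!creds.apiKey)
    throw new Error(`No API key configured for the ${role} provider. Check your ENV.`);
}
```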
* Issue #1943: Add support for LLM provider - Fireworks AI
* Update UI selection boxes
Update base AI keys for future embedder support if needed
Add agent capabilities for FireworksAI (sketch below)
* class only return
---------
Co-authored-by: Aaron Van Doren <vandoren96+1@gmail.com>
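One way to read the "agent capabilities" change above: Fireworks AI is added to the set of providers the agent framework will accept. The provider ids and function below are purely illustrative, not the project's actual list.

```typescript
// Hypothetical whitelist of LLM providers the agent framework may use.
const AGENT_CAPABLE_PROVIDERS = new Set<string>([
  "openai",
  "anthropic",
  "fireworksai", // added alongside the Fireworks AI provider
]);

// Guard used when a workspace tries to run an agent with a given provider.
function assertAgentProvider(provider: string): void {
  if (!AGENT_CAPABLE_PROVIDERS.has(provider)) {
    throw new Error(`${provider} is not supported as an agent provider yet.`);
  }
}

assertAgentProvider("fireworksai"); // passes; an unsupported id would throw
```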
* initial commit for chrome extension
* wip browser extension backend
* wip frontend browser extension settings
* fix typo for browserExtension route
* implement verification codes + frontend panel for browser extension keys
* reorganize + state management for all connection states
* implement embed to workspace
* add "send page to AnythingLLM" extension option + refactor
* refactor connection string auth + update context menus + organize background.js into models
* popup extension from main app and save if successful
* fix Hebrew translation misspelling
* fetch custom logo inside chrome extension
* delete api keys on disconnect of extension
* use correct apiUrl constant in frontend + remove unneeded comments
* remove upload-link endpoint and send inner text html to raw text collector endpoint
* update readme
* fix readme link
* fix readme typo
* update readme
* handle deletion of browser keys with key id and DELETE endpoint (see the endpoint sketch below)
* move event string to constant
* remove tablename and writable fields from BrowserExtensionApiKey backend model
* add border-none to all buttons and inputs for desktop compatibility
* patch prisma injections
* update delete endpoints to delete keys by id
* remove unused prop
* add button to attempt browser extension connection + remove max active keys
* wip multi user mode support
* multi user mode support
* clean up backend + show created by in frontend browser extension page
* show multi user warning message on key creation + hide context menus when no workspaces
* show browser extension options to managers
* small backend changes and refactors
* extension cleanup
* rename submodule
* extension updates & docs
* dev docker build
---------
Co-authored-by: shatfield4 <seanhatfield5@gmail.com>
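The key-management entries above reduce to a DELETE route that looks a key up by numeric id, applies a multi-user guard, and removes it through Prisma's typed API rather than any raw query (the "patch prisma injections" item). Everything in this sketch (route path, model name, auth middleware, role names) is an assumption for illustration, not the project's actual code.

```typescript
import express from "express";
import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();
const app = express();

// Hypothetical DELETE-by-id endpoint for browser extension API keys.
app.delete("/browser-extension/api-keys/:id", async (req, res) => {
  // Coerce the path param to a number so Prisma always receives a typed value
  // and never a string interpolated into a raw query.
  const id = Number(req.params.id);
  if (!Number.isInteger(id)) return res.status(400).json({ error: "Invalid key id" });

  // Model name is assumed; adjust to whatever the Prisma schema actually defines.
  const key = await prisma.browser_extension_api_keys.findUnique({ where: { id } });
  if (!key) return res.status(404).json({ error: "Key not found" });

  // Hypothetical multi-user rule: managers/admins may revoke any key,
  // a default user only the keys they created themselves.
  const user = (req as any).user; // assumed to be attached by earlier auth middleware
  if (user && user.role === "default" && key.user_id !== user.id)
    return res.status(403).json({ error: "Not allowed to delete this key" });

  await prisma.browser_extension_api_keys.delete({ where: { id } });
  return res.status(204).end();
});
```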
* add text gen web ui LLM provider support
* update README
* README typo
* update TextWebUI display name
patch workspace<>model support for provider
---------
Co-authored-by: timothycarambat <rambat1010@gmail.com>
[ 📝 ] Added new LLMs to the supported LLMs list in the README
- Added KoboldCPP to supported LLMs list
- Added Cohere to supported LLMs list
- Added Generic OpenAI to supported LLMs list
- Added Cohere to supported Embedding models list
* WIP openrouter integration
* add OpenRouter options to onboarding flow and data handling
* add todo to fix headers for rankings
* OpenRouter LLM support complete
* Fix hanging response stream with OpenRouter
update tagline
update comment
* update timeout comment
* wait for first chunk to start timer (sketch below)
* sort OpenRouter models by organization
* uppercase first letter of organization
* sort grouped models by org
---------
Co-authored-by: timothycarambat <rambat1010@gmail.com>
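The hanging-stream fix above amounts to a silence watchdog: once the first chunk has arrived, a long gap with no further chunks is treated as the end of the response instead of waiting for a close event OpenRouter may never send. A minimal sketch, assuming the response is consumed as an AsyncIterable of text chunks; the 500 ms window and the wrapper name are illustrative, not the handler's actual values.

```typescript
// Wrap a chunk stream so that, after the first chunk, prolonged silence ends it.
async function* withSilenceTimeout(
  source: AsyncIterable<string>,
  maxSilenceMs = 500 // illustrative window
): AsyncIterable<string> {
  const iterator = source[Symbol.asyncIterator]();
  let timerArmed = false; // only start timing once the first chunk has arrived

  while (true) {
    const next = iterator.next();
    let result: IteratorResult<string>;

    if (!timerArmed) {
      // Before the first chunk, wait as long as the provider needs (time to
      // first token can legitimately be long).
      result = await next;
      timerArmed = true;
    } else {
      // After the first chunk, racing against a timer closes streams that go
      // silent without ever sending a terminal event.
      const timeout = new Promise<"timeout">((resolve) =>
        setTimeout(() => resolve("timeout"), maxSilenceMs)
      );
      const raced = await Promise.race([next, timeout]);
      if (raced === "timeout") {
        await iterator.return?.(); // stop pulling from the silent stream
        return;
      }
      result = raced;
    }

    if (result.done) return;
    yield result.value;
  }
}
```

Wrapping the provider's chunk iterator (for await (const text of withSilenceTimeout(chunks)) { ... }) keeps the client from hanging on a response that never formally closes.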
* add LLM support for Perplexity
* update README & example env
* fix ENV keys in example env files
* slight changes for QA of perplexity support
* Update Perplexity AI name
---------
Co-authored-by: timothycarambat <rambat1010@gmail.com>