Commit graph

10 commits

Author SHA1 Message Date
timothycarambat
64b3210db2 build our own worker fanout and wrapper 2025-02-14 10:31:06 -08:00
timothycarambat
2f89bbae74 OCR PDFs as fallback in spawn thread 2025-02-13 16:06:47 -08:00
Sean Hatfield
48dcb22b25 Dynamic fetching of TogetherAI models ()
* implement dynamic fetching of togetherai models

* implement caching for togetherai models

* update gitignore for togetherai model caching

* Remove models.json from git tracking

* Remove .cached_at from git tracking

* lint

* revert unneeded change

---------

Co-authored-by: Timothy Carambat <rambat1010@gmail.com>
2025-01-24 11:06:59 -08:00
Timothy Carambat
21af81085a Add caching to Gemini /models ()
fix file name typo
2025-01-13 13:12:03 -08:00
Timothy Carambat
ad01df8790 Reranker option for RAG ()
* Reranker WIP

* add caching and singleton loading

* Add field to workspaces for vectorSearchMode
Add UI for lancedb to change mode
update all search endpoints to pass in reranker prop if provider can use it

* update hint text

* When reranking, swap score to rerank score

* update optional chaining
2025-01-02 14:27:52 -08:00
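The "swap score to rerank score" bullet above describes replacing each search hit's vector-similarity score with the reranker's relevance score before ordering results. A hedged sketch, assuming a generic hit shape with a `score` field (`applyRerank` and the field names are illustrative, not AnythingLLM's actual code):

```javascript
// Sketch: after a vector search, a reranker scores each hit for relevance
// to the query; the original similarity score is swapped out for the
// rerank score, then results are re-sorted by it.
function applyRerank(results, rerankScores) {
  return results
    .map((hit, i) => ({ ...hit, score: rerankScores[i] })) // swap score -> rerank score
    .sort((a, b) => b.score - a.score); // best rerank score first
}
```

Overwriting the single `score` field (rather than adding a second one) means downstream consumers that already sort or threshold on `score` need no changes, which matches the commit's framing of the swap.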
Timothy Carambat
80565d79e0 2488 novita ai llm integration ()
* feat: add new model provider: Novita AI

* feat: finished novita AI

* fix: code lint

* remove unneeded logging

* add back log for novita stream not self closing

* Clarify ENV vars for LLM/embedder separation for future
Patch ENV check for workspace/agent provider

---------

Co-authored-by: Jason <ggbbddjm@gmail.com>
Co-authored-by: shatfield4 <seanhatfield5@gmail.com>
2024-11-04 11:34:29 -08:00
Timothy Carambat
bce7988683 Integrate Apipie support directly ()
resolves 
resolves 
Note: Streaming not supported
2024-10-15 12:36:06 -07:00
Timothy Carambat
ac6ca13f60 1173 dynamic cache openrouter ()
* patch agent invocation rule

* Add dynamic model cache from OpenRouter API for context length and available models
2024-04-23 11:10:54 -07:00
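The OpenRouter commit above caches the `/models` response to look up context lengths at runtime. A minimal sketch of that lookup, assuming the public OpenRouter response shape (`data[].id`, `data[].context_length`); the function name and the default value are illustrative assumptions:

```javascript
// Sketch: build a model-id -> context-length map from a (cached)
// OpenRouter /models response, falling back to a conservative default
// when a model omits its context length.
const DEFAULT_CONTEXT = 4096; // assumed fallback, not OpenRouter's value

function contextWindows(modelsResponse) {
  const map = {};
  for (const m of modelsResponse.data || []) {
    map[m.id] = m.context_length ?? DEFAULT_CONTEXT;
  }
  return map;
}
```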
Timothy Carambat
1e98da07bc docs: placeholder for model downloads folder () 2023-12-14 10:31:14 -08:00
Timothy Carambat
88cdd8c872 Add built-in embedding engine into AnythingLLM ()
* Implement use of native embedder (all-MiniLM-L6-v2)
stop showing prisma queries during dev

* Add native embedder as an available embedder selection

* wrap model loader in try/catch

* print progress on download

* Update to progress output for embedder

* move embedder selection options to component

* forgot import

* add Data privacy alert updates for local embedder
2023-12-06 10:36:22 -08:00