anything-llm/server/utils/EmbeddingEngines/genericOpenAi
hdelossantos 304796ec59
feat: support setting maxConcurrentChunks for Generic OpenAI embedder ()
* Exposes the `maxConcurrentChunks` parameter for the Generic OpenAI embedder through configuration. This allows setting a batch size for endpoints that don't support the default of 500.

* Add the new field to the new UI; use a getter to ensure proper type and format

---------

Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-11-21 11:29:44 -08:00
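The change above can be sketched as follows. This is a minimal illustration, not the actual AnythingLLM implementation: the helper names (`toChunks`, `getMaxConcurrentChunks`) and the env-variable-style input are assumptions chosen to show how a configurable batch size with the historical default of 500 might be read and applied.

```javascript
// Split an array of text chunks into batches of at most `size`
// before sending them to the embedding endpoint. Illustrative sketch.
function toChunks(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Read maxConcurrentChunks from a raw (string) config value, coercing it
// to a positive integer and falling back to the default of 500 otherwise.
// This mirrors the "getter to ensure proper type and format" idea.
function getMaxConcurrentChunks(raw, fallback = 500) {
  const parsed = Number.parseInt(raw, 10);
  return Number.isInteger(parsed) && parsed > 0 ? parsed : fallback;
}
```

With this shape, an endpoint that rejects 500-item requests can be handled by configuring a smaller value, e.g. `toChunks(texts, getMaxConcurrentChunks("100"))` yields batches of at most 100 chunks each.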
index.js | feat: support setting maxConcurrentChunks for Generic OpenAI embedder () | 2024-11-21 11:29:44 -08:00