Mirror of https://github.com/Mintplex-Labs/anything-llm.git (synced 2025-03-27 08:04:43 +00:00)
* Exposes the `maxConcurrentChunks` parameter for the generic OpenAI embedder through configuration. This allows setting a batch size for endpoints which don't support the default of 500.
* Update the new field in the UI to ensure proper type and format.

Co-authored-by: timothycarambat <rambat1010@gmail.com>
index.js
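The batching behavior the commit describes can be sketched as follows. This is a hypothetical illustration, not the actual anything-llm implementation: `toChunkBatches` is an assumed helper name, showing how a configurable `maxConcurrentChunks` would split texts into request-sized batches for an embedding endpoint that cannot accept the default of 500 inputs at once.

```javascript
// Hypothetical sketch: split an array of texts into batches of at most
// `maxConcurrentChunks` items, so each batch fits within what the
// embedding endpoint accepts per request.
function toChunkBatches(texts, maxConcurrentChunks = 500) {
  const batches = [];
  for (let i = 0; i < texts.length; i += maxConcurrentChunks) {
    batches.push(texts.slice(i, i + maxConcurrentChunks));
  }
  return batches;
}

// Example: an endpoint that only accepts 100 inputs per request.
const texts = Array.from({ length: 250 }, (_, i) => `doc ${i}`);
const batches = toChunkBatches(texts, 100);
console.log(batches.length);      // 3 batches
console.log(batches[2].length);   // last batch holds the remaining 50
```

Each batch would then be sent to the embedding endpoint in turn (or with limited concurrency), rather than one oversized request.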