Mirror of https://github.com/Mintplex-Labs/anything-llm.git (synced 2025-03-30 01:16:26 +00:00)
* Exposes the `maxConcurrentChunks` parameter for the generic OpenAI embedder through configuration. This allows setting a batch size for endpoints that don't support the default of 500.
* Updates the new field in the UI to ensure proper type and format.

Co-authored-by: timothycarambat <rambat1010@gmail.com>
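The batching behavior described above can be sketched as follows. This is a minimal illustration, not the project's actual implementation; the helper name `toBatches` and its usage are assumptions. The idea is simply to split the list of texts into request-sized groups no larger than `maxConcurrentChunks` before sending them to an OpenAI-compatible embeddings endpoint.

```javascript
// Hypothetical helper: split an array of texts into batches capped at
// `maxConcurrentChunks` items each, so endpoints with a smaller request
// limit than the default of 500 can still be used.
function toBatches(texts, maxConcurrentChunks = 500) {
  const batches = [];
  for (let i = 0; i < texts.length; i += maxConcurrentChunks) {
    batches.push(texts.slice(i, i + maxConcurrentChunks));
  }
  return batches;
}

// Example: an endpoint that only accepts 2 inputs per request
// yields three batches for five texts.
console.log(toBatches(["a", "b", "c", "d", "e"], 2));
```

Each batch would then be sent as one embeddings request, with the results concatenated in order.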
admin
chat
camelcase.js
customModels.js
index.js
portAvailabilityChecker.js
tiktoken.js
updateENV.js