anything-llm/docker

Latest commit 304796ec59 by hdelossantos:

feat: support setting maxConcurrentChunks for Generic OpenAI embedder ()
* Exposes the `maxConcurrentChunks` parameter for the Generic OpenAI embedder through configuration. This allows setting a batch size for endpoints that don't support the default of 500.

* Add the new field to the new UI, with a getter to ensure the proper type and format.

---------

Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-11-21 11:29:44 -08:00
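The batching behavior this commit makes configurable can be sketched as follows. This is an illustrative example, not the project's actual code: it simply shows how a `maxConcurrentChunks` limit would split a list of text chunks into request-sized batches before they are sent to an embedding endpoint.

```javascript
// Illustrative sketch (not AnythingLLM's actual implementation):
// split chunks into batches no larger than maxConcurrentChunks.
// The default of 500 can exceed the request limits of some
// OpenAI-compatible endpoints, hence the configurable batch size.
function toBatches(chunks, maxConcurrentChunks = 500) {
  const batches = [];
  for (let i = 0; i < chunks.length; i += maxConcurrentChunks) {
    batches.push(chunks.slice(i, i + maxConcurrentChunks));
  }
  return batches;
}

// e.g. 1200 chunks with the default limit of 500
// yield three batches of sizes 500, 500, and 200.
```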
| File | Last commit | Date |
| --- | --- | --- |
| vex | Add known VEX files to build process () | 2024-07-25 11:13:57 -07:00 |
| .env.example | feat: support setting maxConcurrentChunks for Generic OpenAI embedder () | 2024-11-21 11:29:44 -08:00 |
| docker-compose.yml | Docker build frontend layer improvements () | 2024-07-19 15:01:16 -07:00 |
| docker-entrypoint.sh | Update Ubuntu base image and improve Dockerfile () | 2024-03-06 16:34:45 -08:00 |
| docker-healthcheck.sh | Update Ubuntu base image and improve Dockerfile () | 2024-03-06 16:34:45 -08:00 |
| Dockerfile | fix(Dockerfile): remove hardcoded exposed port () | 2024-08-13 09:16:03 -07:00 |
| HOW_TO_USE_DOCKER.md | Feature/add searchapi web browsing () | 2024-09-05 10:36:46 -07:00 |