# How to use Dockerized Anything LLM
Use the Dockerized version of AnythingLLM for a much faster and more complete startup of AnythingLLM.
### Minimum Requirements

> [!TIP]
> Running AnythingLLM on AWS/GCP/Azure?
> You should aim for at least 2GB of RAM. Disk storage is proportional to how much data
> you will be storing (documents, vectors, models, etc.). A minimum of 10GB is recommended.

- `docker` installed on your machine
- `yarn` and `node` on your machine
- access to an LLM running locally or remotely
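
You can sanity-check these prerequisites with a short shell loop — a minimal sketch that only reports what is on your `PATH` (`node` and `yarn` are only needed if you build from source):

```shell
# Report which prerequisites are installed; missing tools are flagged, not fatal.
for cmd in docker node yarn; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found"
  else
    echo "$cmd: MISSING"
  fi
done
```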

\*AnythingLLM by default uses a built-in vector database powered by [LanceDB](https://github.com/lancedb/lancedb).

\*AnythingLLM by default embeds text privately on the instance. [Learn More](../server/storage/models/README.md)

## Recommended way to run dockerized AnythingLLM!
> [!IMPORTANT]
> If you are running another service on localhost, like Chroma, LocalAI, or LMStudio,
> you will need to use http://host.docker.internal:xxxx to access the service from within
> the docker container, as `localhost:xxxx` will not resolve to the host system.
>
> **Requires** Docker v18.03+ on Win/Mac and 20.10+ on Linux/Ubuntu for host.docker.internal to resolve!
>
> _Linux_: add `--add-host=host.docker.internal:host-gateway` to docker run command for this to resolve.
>
> e.g. A Chroma host URL of localhost:8000 on the host machine needs to be http://host.docker.internal:8000
> when used in AnythingLLM.
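
On Linux with docker compose, the `--add-host` mapping above can be declared per-service with `extra_hosts` — a hedged sketch; the service name here is an assumption, not from an official compose file:

```yaml
services:
  anythingllm:
    image: mintplexlabs/anythingllm
    extra_hosts:
      # Maps host.docker.internal to the host's gateway IP on Linux
      - "host.docker.internal:host-gateway"
```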
> [!TIP]
> It is best to mount the container's storage volume to a folder on your host machine
> so that you can pull in future updates without deleting your existing data!
Pull in the latest image from Docker Hub. The image supports both `amd64` and `arm64` CPU architectures.
```shell
docker pull mintplexlabs/anythingllm
```

<table>
<tr>
<th colspan="2">Mount the storage locally and run AnythingLLM in Docker</th>
</tr>
<tr>
<td>
Linux/MacOs
</td>
<td>

```shell
export STORAGE_LOCATION=$HOME/anythingllm && \
mkdir -p $STORAGE_LOCATION && \
touch "$STORAGE_LOCATION/.env" && \
docker run -d -p 3001:3001 \
--cap-add SYS_ADMIN \
-v ${STORAGE_LOCATION}:/app/server/storage \
-v ${STORAGE_LOCATION}/.env:/app/server/.env \
-e STORAGE_DIR="/app/server/storage" \
mintplexlabs/anythingllm
```
</td>
</tr>
<tr>
<td>
Windows
</td>
<td>

```powershell
# Run this in powershell terminal
$env:STORAGE_LOCATION="$HOME\Documents\anythingllm"; `
If(!(Test-Path $env:STORAGE_LOCATION)) {New-Item $env:STORAGE_LOCATION -ItemType Directory}; `
If(!(Test-Path "$env:STORAGE_LOCATION\.env")) {New-Item "$env:STORAGE_LOCATION\.env" -ItemType File}; `
docker run -d -p 3001:3001 `
--cap-add SYS_ADMIN `
-v "$env:STORAGE_LOCATION`:/app/server/storage" `
-v "$env:STORAGE_LOCATION\.env:/app/server/.env" `
-e STORAGE_DIR="/app/server/storage" `
mintplexlabs/anythingllm;
```
</td>
</tr>
</table>
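
If you prefer docker compose over `docker run`, the Linux/Mac command above maps to roughly the following compose file — a sketch assuming the same image, port, and storage layout (the service name is illustrative):

```yaml
services:
  anythingllm:
    image: mintplexlabs/anythingllm
    cap_add:
      - SYS_ADMIN
    ports:
      - "3001:3001"
    environment:
      - STORAGE_DIR=/app/server/storage
    volumes:
      # Same host folder and .env file as the docker run example above
      - ${STORAGE_LOCATION}:/app/server/storage
      - ${STORAGE_LOCATION}/.env:/app/server/.env
```

Run it with `STORAGE_LOCATION=$HOME/anythingllm docker-compose up -d` after creating the folder and `.env` file as shown above.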
Go to `http://localhost:3001` and you are now using AnythingLLM! All your data and progress will persist between
container rebuilds or pulls from Docker Hub.
## How to use the user interface
- To access the full application, visit `http://localhost:3001` in your browser.

## About UID and GID in the ENV
- The UID and GID are set to 1000 by default. This is the default user in the Docker container and on most host operating systems. If there is a mismatch between your host user UID and GID and what is set in the `.env` file, you may experience permission issues.
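
To check whether a mismatch applies to you, compare your host user's IDs against the container default — a minimal sketch:

```shell
# The container defaults to UID/GID 1000; print your host user's values.
echo "host UID: $(id -u)"
echo "host GID: $(id -g)"
# If either is not 1000, set UID and GID accordingly in the mounted .env file
# (e.g. UID=1001, GID=1001) before starting the container.
```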

## Build locally from source (_not recommended for casual use_)
- `git clone` this repo and `cd anything-llm` to get to the root directory.
- `touch server/storage/anythingllm.db` to create empty SQLite DB file.
- `cd docker/`
- `cp .env.example .env` **you must do this before building**
- `docker-compose up -d --build` to build the image - this will take a few moments.
Your docker host will show the image as online once the build process is completed. The app will then be available at `http://localhost:3001`.
## Integrations and one-click setups

The integrations below are templates or tooling built by the community to make running the docker experience of AnythingLLM easier.

### Use the Midori AI Subsystem to Manage AnythingLLM

Follow the setup found on the [Midori AI Subsystem Site](https://io.midori-ai.xyz/subsystem/manager/) for your host OS.
After setting that up, install the AnythingLLM docker backend to the Midori AI Subsystem.

Once that is done, you are all set!
## Common questions and fixes
### Cannot connect to service running on localhost!
If you are in docker and cannot connect to a service running on your host machine bound to a local interface or loopback address such as:
- `localhost`
- `127.0.0.1`
- `0.0.0.0`
> [!IMPORTANT]
> On Linux, `http://host.docker.internal:xxxx` does not work.
> Use `http://172.17.0.1:xxxx` instead to emulate this functionality.
Then in docker you need to replace that localhost part with `host.docker.internal`. For example, if running Ollama on the host machine, bound to `http://127.0.0.1:11434`, you should put `http://host.docker.internal:11434` into the connection URL in AnythingLLM.
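
The substitution is mechanical, e.g. for the Ollama case above — a sketch using `sed`; any loopback host part is handled the same way:

```shell
# Rewrite a loopback URL so it resolves from inside the container.
host_url="http://127.0.0.1:11434"
container_url=$(echo "$host_url" | sed 's/127\.0\.0\.1/host.docker.internal/')
echo "$container_url"   # http://host.docker.internal:11434
```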
### API is not working, cannot login, LLM is "offline"?
You are likely running the docker container on a remote machine like EC2 or some other instance where the reachable URL
is not `http://localhost:3001` and is instead something like `http://193.xx.xx.xx:3001`. In this case, all you need to do is add the following to your `frontend/.env.production` before running `docker-compose up -d --build`:
```
# frontend/.env.production
GENERATE_SOURCEMAP=false
VITE_API_BASE="http://<YOUR_REACHABLE_IP_ADDRESS>:3001/api"
```
For example, if the docker instance is available on `192.186.1.222` your `VITE_API_BASE` would look like `VITE_API_BASE="http://192.186.1.222:3001/api"` in `frontend/.env.production`.
### Having issues with Ollama?

If you are getting errors like `llama:streaming - could not stream chat. Error: connect ECONNREFUSED 172.17.0.1:11434` then visit the README below.

[Fix common issues with Ollama](../server/utils/AiProviders/ollama/README.md)
### Still not working?
[Ask for help on Discord](https://discord.gg/6UyHPeGZAC)