Set >=6 GB RAM required for offline chat

Llama v2 7B with 4-bit quantization technically needs ~3.5 GB of RAM (7B parameters × 0.5 bytes per parameter); in practice, a system with 6 GB of RAM should suffice
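The arithmetic behind the requirement can be sketched as a quick back-of-the-envelope estimate. This is a minimal sketch, assuming model weights dominate memory use and ignoring KV-cache and runtime overhead; the function name is hypothetical, not part of the project:

```python
def quantized_model_ram_gb(num_params: float, bits_per_param: int) -> float:
    """Rough RAM needed to hold quantized weights, in GB (decimal)."""
    bytes_per_param = bits_per_param / 8  # 4-bit quantization -> 0.5 bytes
    return num_params * bytes_per_param / 1e9

# Llama v2 7B at 4-bit: 7e9 params * 0.5 bytes = 3.5 GB
print(quantized_model_ram_gb(7e9, 4))  # → 3.5
```

The extra headroom above 3.5 GB accounts for the OS, the application itself, and inference-time buffers, which is why the docs ask for 6 GB rather than the bare weight size.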
This commit is contained in:
Debanjum 2023-10-18 12:05:54 -07:00 committed by GitHub
parent 8346e1193c
commit d93395ae48


@@ -10,7 +10,7 @@
Offline chat stays completely private and works without internet. But it is slower, lower quality and more compute intensive.
> **System Requirements**:
-> - You need at least **16 GB of RAM** and **4 GB of Disk**
+> - Machine with at least **6 GB of RAM** and **4 GB of Disk** available
> - A CPU supporting [AVX or AVX2 instructions](https://en.wikipedia.org/wiki/Advanced_Vector_Extensions) is required
> - A Mac M1+ or [Vulkan-supported GPU](https://vulkan.gpuinfo.org/) should significantly speed up chat response times