# llux
llux is an AI chatbot for the Matrix chat protocol. It uses local LLMs via Ollama and supports both image recognition and image generation. Each user in a Matrix room can set a unique personality (or system prompt), and conversations are kept per user, per channel. Model switching (OpenAI or Ollama) is also supported if you have multiple models configured.
## Getting Started

- **Install Ollama**

  You'll need Ollama to run local LLMs. A quick install:

  ```sh
  curl https://ollama.ai/install.sh | sh
  ```

  Then pull your preferred model(s) with `ollama pull <modelname>`.
- **Install matrix-nio**

  ```sh
  pip3 install matrix-nio
  ```
- **Set Up Your Bot**

  - Create a Matrix account for your bot (on a server of your choice).
  - Record the server, username, and password.
  - Copy `config.yaml-example` to `config.yaml` (e.g., `cp config.yaml-example config.yaml`).
  - In your new `config.yaml`, fill in the relevant fields (Matrix server, username, password, channels, admin usernames, etc.). Also configure the Ollama section for your model settings and the Diffusers section for image generation (model, device, steps, etc.).
- **Run llux**

  ```sh
  python3 llux.py
  ```
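For orientation, a `config.yaml` along the lines described above might look like the sketch below. The exact schema comes from the bundled `config.yaml-example`; the key names here (`matrix`, `ollama`, `diffusers`, and their fields) are illustrative assumptions, not the canonical layout.

```yaml
# Illustrative sketch only -- consult config.yaml-example for the real schema.
matrix:
  server: "https://matrix.example.org"   # your bot's homeserver (assumed key name)
  username: "@llux:example.org"
  password: "change-me"
  channels:
    - "#myroom:example.org"
  admins:
    - "@alice:example.org"

ollama:
  model: "llama3"                        # default chat model; pull it first with `ollama pull`

diffusers:
  model: "stabilityai/stable-diffusion-2-1"
  device: "cuda"                         # or "cpu" / "mps"
  steps: 30                              # diffusion steps per generated image
```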
## Usage

- `.ai message` or `botname: message`

  Basic conversation or roleplay prompt.

- `.x username message`

  Interact with another user's chat history (use that user's display name).

- `.persona personality`

  Set or change to a specific roleplaying personality.

- `.custom prompt`

  Override the default personality with a custom system prompt.

- `.reset`

  Clear your personal conversation history and revert to the preset personality.

- `.stock`

  Clear your personal conversation history without applying any system prompt.
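As a sketch of how these commands look in a room (the usernames and the bot's reply wording below are illustrative, not llux's actual responses):

```text
alice: .persona sarcastic pirate
llux:  Personality set.
alice: .ai what's the weather like?
llux:  Arr, do I look like a weathervane to ye?
alice: .reset
llux:  Conversation history cleared.
```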
## Admin Commands

- `.model modelname`

  - Omit `modelname` to show the current model and available options.
  - Include `modelname` to switch to that model.

- `.clear`

  Reset llux for everyone, clearing all stored conversations and returning to the default settings.