Open WebUI

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various LLM runners like Ollama and OpenAI-compatible APIs, with a built-in inference engine for RAG, making it a powerful AI deployment solution.

Installation

For Ubuntu Users

Set up Docker’s apt repository:

sudo apt-get update
sudo apt-get install ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc
echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu \
  $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \
  sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

Note: If using an Ubuntu derivative (e.g., Linux Mint), use UBUNTU_CODENAME instead of VERSION_CODENAME.
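
For example, on such a derivative the repository entry could be written as follows (a sketch, assuming the distribution's /etc/os-release defines UBUNTU_CODENAME):

  # assumes the derivative's /etc/os-release defines UBUNTU_CODENAME (true for Linux Mint)
  echo \
    "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu \
    $(. /etc/os-release && echo "$UBUNTU_CODENAME") stable" | \
    sudo tee /etc/apt/sources.list.d/docker.list > /dev/null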

Install Docker Engine:

  sudo apt-get update
  sudo apt-get install docker-ce docker-ce-cli containerd.io docker-compose-plugin

Verify Docker Installation:

  sudo docker run hello-world

Step 1: Pull the Open WebUI Image

Start by pulling the latest Open WebUI Docker image from the GitHub Container Registry.

  docker pull ghcr.io/open-webui/open-webui:main

Step 2: Run the Container

Run the container with default settings. This command includes a volume mapping to ensure persistent data storage.

  docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main

Important Flags

Volume Mapping (-v open-webui:/app/backend/data): Ensures persistent storage of your data and prevents data loss between container restarts.

Port Mapping (-p 3000:8080): Exposes the WebUI on port 3000 of your local machine.
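
To confirm that the named volume was created and to see where Docker stores its data on the host, you can inspect it (a quick check with a standard Docker command; the volume name matches the -v flag above):

  # shows the volume's mountpoint on the host, e.g. /var/lib/docker/volumes/open-webui/_data
  docker volume inspect open-webui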

Using GPU Support

For NVIDIA GPU support, add --gpus all to the docker run command and use the :cuda image tag:

  docker run -d -p 3000:8080 --gpus all -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:cuda
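
To verify that containers can actually see the GPU, you can run nvidia-smi inside a CUDA base image (a sketch; this assumes the NVIDIA Container Toolkit is installed on the host, and the CUDA image tag is only an example):

  # should print the same GPU table as running nvidia-smi on the host
  docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi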

Single-User Mode (Disabling Login)

To bypass the login page for a single-user setup, set the WEBUI_AUTH environment variable to False:

  docker run -d -p 3000:8080 -e WEBUI_AUTH=False -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main

Warning

You cannot switch between single-user mode and multi-account mode after this change.

Advanced Configuration: Connecting to Ollama on a Different Server

To connect Open WebUI to an Ollama server located on another host, add the OLLAMA_BASE_URL environment variable:

  docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
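
You can check that the remote Ollama server is reachable from the Docker host before pointing Open WebUI at it (a quick sketch; https://example.com stands in for your Ollama server's address, and /api/version is Ollama's version endpoint):

  # should return a small JSON document such as {"version":"..."}
  curl https://example.com/api/version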

Access the WebUI

After the container is running, access Open WebUI at:

  http://localhost:3000
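
If the page does not load, you can check from the command line that the container is up and responding (standard Docker and curl commands; the name and port match the run commands above):

  docker ps --filter name=open-webui
  # prints the HTTP status code returned by the WebUI (expect 200)
  curl -s -o /dev/null -w '%{http_code}\n' http://localhost:3000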

For detailed help on each Docker flag, see Docker's documentation.

Updating

To update your local Open WebUI installation to the latest version, you can either use Watchtower or update the container manually.

Option 1: Using Watchtower

With Watchtower, you can automate the update process:

  docker run --rm --volume /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once open-webui
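
Watchtower can also run as a long-lived container that checks for new images on a schedule instead of a one-off run (a sketch using Watchtower's --interval and --cleanup options; the 86400-second interval, i.e. once a day, is only an example):

  # checks for a new open-webui image once a day and removes the old image after updating
  docker run -d --name watchtower \
    --volume /var/run/docker.sock:/var/run/docker.sock \
    containrrr/watchtower --cleanup --interval 86400 open-webui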

(Replace open-webui with your container's name if it's different.)

Option 2: Manual Update

Stop and remove the current container:

  docker rm -f open-webui

Pull the latest version:

  docker pull ghcr.io/open-webui/open-webui:main

Start the container again:

  docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main

Both methods will get your Open WebUI instance updated and running with the latest build.
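
After updating with either method, you can confirm which image the running container was started from (a standard Docker command; the container name matches the examples above):

  # prints the image reference the container is running, e.g. ghcr.io/open-webui/open-webui:main
  docker inspect --format '{{.Config.Image}}' open-webui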
