Ollama and OpenWebUI setup on Arch Linux with Docker

Instructions

NVIDIA Container Toolkit

Install NVIDIA Container Toolkit

sudo pacman -S nvidia-container-toolkit
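
Optionally, confirm the toolkit installed correctly by printing the version of the CLI it ships with:

nvidia-ctk --version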

Configure Docker to use NVIDIA GPU

sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

Run Ollama and OpenWebUI

Option 1 - start OpenWebUI with bundled Ollama

docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
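
With the -p 3000:8080 mapping above, the web interface is served on the host at http://localhost:3000 once the container is up; startup progress can be followed with:

docker logs -f open-webui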

Option 2 - start Ollama and OpenWebUI separately

Start Ollama

docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
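
A quick sanity check is to hit the published port from the host; the server should answer with a short "Ollama is running" message:

curl http://localhost:11434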

Start OpenWebUI

docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
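
The --add-host flag makes the host reachable inside the container as host.docker.internal. If the UI does not pick up the local Ollama instance automatically, OpenWebUI also accepts an explicit OLLAMA_BASE_URL environment variable; the URL below assumes the Option 2 Ollama container started above:

docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -e OLLAMA_BASE_URL=http://host.docker.internal:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda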

Pull models

Access the Docker container shell

docker exec -it open-webui bash
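
This assumes the bundled image from Option 1, which ships the ollama CLI inside the open-webui container. With Option 2 the CLI lives in the ollama container instead, so exec into that one; once inside either shell, ollama list shows which models are already available:

docker exec -it ollama bash
ollama list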

Pull a model, for example deepseek-r1:1.5b

ollama pull deepseek-r1:1.5b
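
To confirm the pulled model responds, you can run it from the same container shell, or (with Option 2, where port 11434 is published) call the Ollama REST API from the host; the prompt is just an example:

ollama run deepseek-r1:1.5b "Why is the sky blue?"

curl http://localhost:11434/api/generate -d '{"model": "deepseek-r1:1.5b", "prompt": "Why is the sky blue?", "stream": false}'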

References

https://github.com/ollama/ollama

https://hub.docker.com/r/ollama/ollama

https://ollama.com/library/deepseek-r1:1.5b

https://github.com/open-webui/open-webui

https://archlinux.org/packages/extra/x86_64/nvidia-container-toolkit/
