Open WebUI + Ollama + DeepSeek-R1-Distill-Qwen-1.5B
# HW Requirements
# I recommend 8 CPU cores and 16 GB of RAM on the system or VM if using the deepseek-r1:1.5b model
# 1. install Docker
# e.g. see: https://www.digitalocean.com/community/tutorials/how-to-install-and-use-docker-on-rocky-linux-9
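# (optional) sanity-check the Docker install before proceeding; a minimal sketch:
`docker --version`
`sudo docker run --rm hello-world`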
# 2a. assuming an Nvidia GPU is available, run:
`docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama`
# 2b. if an Nvidia GPU is not available, run:
`docker run -d -p 3000:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama`
# 2c. alternate commands and docs are available here: https://docs.openwebui.com/
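# (optional) confirm the container is running and watch the startup logs; a minimal check using the container name from step 2:
`docker ps --filter name=open-webui`
`docker logs -f open-webui`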
# 3. access the web UI
# open `http://localhost:3000/`
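# (optional) if no browser is available on the host, a quick reachability check (assumes curl is installed):
`curl -I http://localhost:3000/`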
# 4. click the model drop-down and select 'add model'
# type in: `deepseek-r1:1.5b`; select 'download from ollama'
# 5. select the model, and query away!
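# (optional) CLI alternative to steps 4-5, assuming the bundled :ollama image ships the `ollama` binary inside the container:
`docker exec -it open-webui ollama pull deepseek-r1:1.5b`
`docker exec -it open-webui ollama run deepseek-r1:1.5b "Why is the sky blue?"`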