@kerrytazi
Created January 28, 2025 04:06
## Docker compose file to run deepseek-r1 locally.
## Create directories to mount
# mkdir ollama
# mkdir webui
## Run docker compose
# docker compose up -d
## Download a model (run once; it is stored in the 'ollama' dir)
# docker exec -it ollama ollama pull deepseek-r1:8b
## Other models are listed at the end of the file
## Open the web page (webui may take a few minutes to start the first time)
# http://localhost:3000
services:
  ollama:
    image: ollama/ollama:0.5.7
    container_name: ollama
    networks:
      - ollama-net
    ## Expose the API on the host
    # ports:
    #   - "11434:11434"
    volumes:
      - ./ollama:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
  webui:
    image: ghcr.io/open-webui/open-webui:main # Unfortunately pinned to the moving 'main' tag, not a version
    container_name: webui
    networks:
      - ollama-net
    ports:
      - "3000:8080"
    volumes:
      - ./webui:/app/backend/data
    environment:
      - OLLAMA_BASE_URL=http://ollama.ollama-net:11434
networks:
  ollama-net:
    name: ollama-net
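## Sanity checks (optional): confirm the container sees the GPU and which
## models are present. Assumes the NVIDIA container toolkit is installed.
# docker exec -it ollama nvidia-smi
# docker exec -it ollama ollama list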
## Misc
## Other models
# docker exec -it ollama ollama pull deepseek-r1:1.5b
# docker exec -it ollama ollama pull deepseek-r1:7b
# docker exec -it ollama ollama pull deepseek-r1:8b
# docker exec -it ollama ollama pull deepseek-r1:14b
# docker exec -it ollama ollama pull deepseek-r1:32b
## Run from the command line (in case you want to test the model from a console)
# docker exec -it ollama ollama run deepseek-r1:8b
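## Query the API directly from the host (assumes the 'ports' lines on the
## ollama service above are uncommented and deepseek-r1:8b has been pulled)
# curl http://localhost:11434/api/generate -d '{"model": "deepseek-r1:8b", "prompt": "Why is the sky blue?", "stream": false}'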