After installing Ollama on Ubuntu, local use works fine, but when you try to access it from another machine (for example, from your laptop to a home lab server), you will see:
yli@yli-hx90g:~$ curl http://192.168.0.163:11434/api/generate -d '{"model": "deepseek-r1:7b", "keep_alive": -1}'
curl: (7) Failed to connect to 192.168.0.163 port 11434 after 3 ms: Connection refused
This is because the ollama daemon started by systemd binds to the default OLLAMA_HOST (127.0.0.1:11434), which only accepts connections from the local machine. Fix it with a drop-in override:
sudo systemctl edit ollama.service
then add the following:
[Service]
# you can change the port 11434 to another port, e.g. 8080
Environment="OLLAMA_HOST=0.0.0.0:11434"
# to customize your model store location, uncomment the line below
# Environment="OLLAMA_MODELS=/path/to/your/models"
After saving (systemctl edit writes the drop-in to /etc/systemd/system/ollama.service.d/override.conf), reload systemd and restart the service:
sudo systemctl daemon-reload
sudo systemctl restart ollama
Then try again; it should work:
yli@yli-hx90g:~$ curl http://192.168.0.163:11434/api/generate -d '{"model": "deepseek-r1:7b", "keep_alive": -1}'
{"model":"deepseek-r1:7b","created_at":"2025-05-15T23:47:26.969733557Z","response":"","done":true,"done_reason":"load"}
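Note the done_reason of "load": a generate request with no prompt and "keep_alive": -1 just loads the model and keeps it resident. If you script this as a health check, you can pull the field out of the response; jq would be cleaner, but a plain grep/cut sketch (run here against the sample response above) also works:

```shell
# Sample response copied from the curl call above
resp='{"model":"deepseek-r1:7b","created_at":"2025-05-15T23:47:26.969733557Z","response":"","done":true,"done_reason":"load"}'

# Extract the done_reason field without jq: grep the key, cut out the quoted value
reason=$(echo "$resp" | grep -o '"done_reason":"[^"]*"' | cut -d'"' -f4)
echo "$reason"   # prints: load
```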
You can also verify the setting from the server's systemd logs with:
journalctl -u ollama.service
and look for the line containing INFO server config env=, then within it find OLLAMA_HOST:http://... to confirm the bind address.
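If you want to check this in a script rather than by eye, you can grep the value out of that journal line. The sample line below is a hypothetical abbreviation of what journalctl actually prints (the real line lists many more env vars); the extraction itself is straightforward:

```shell
# Hypothetical excerpt of the "server config" line from journalctl -u ollama.service
line='level=INFO source=routes.go msg="server config" env="map[OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_KEEP_ALIVE:5m0s]"'

# Grab everything after OLLAMA_HOST: up to the next space
host=$(echo "$line" | grep -o 'OLLAMA_HOST:[^ ]*' | cut -d: -f2-)
echo "$host"   # prints: http://0.0.0.0:11434
```

If the host still shows 127.0.0.1 after a restart, the drop-in was not picked up and the daemon-reload step is worth repeating.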