Ollama on one machine, accessible from another. One environment variable, restart, test. Five minutes. But mess up the firewall or skip the reverse proxy, and you’ve joined the roughly 175,000 exposed servers that internet-wide scans had found by early 2025.
How This Works
Ollama binds to 127.0.0.1 port 11434 by default – localhost only. Other devices? Blocked. You’ll change that binding to 0.0.0.0 (all interfaces) or a specific IP. Method varies by OS, but the mechanics are identical.
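The difference is plain socket binding. A minimal Python sketch (purely illustrative, not Ollama code) shows what each address means:

```python
import socket

# Loopback-only bind: reachable from this machine, invisible to the LAN.
lo = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
lo.bind(("127.0.0.1", 0))        # port 0 asks the OS for any free port
print(lo.getsockname())

# All-interfaces bind: reachable from every network the host is attached to.
anywhere = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
anywhere.bind(("0.0.0.0", 0))
print(anywhere.getsockname())

lo.close()
anywhere.close()
```

Ollama does the same thing internally; OLLAMA_HOST just controls which address it hands to bind.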
Fast Method: Environment Variable
Change bind address with OLLAMA_HOST environment variable. Where you set it depends on how Ollama runs – systemd service, macOS app, Windows executable.
Linux (systemd service)
Most Linux installs: systemd service. Edit with systemctl edit ollama.service, add Environment line under [Service], save:
sudo systemctl edit ollama.service
Add this:
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
Reload. Restart.
sudo systemctl daemon-reload
sudo systemctl restart ollama
macOS
The macOS app picks up environment variables set with launchctl; set the variable, then restart the app:
launchctl setenv OLLAMA_HOST "0.0.0.0:11434"
Quit Ollama from menu bar. Relaunch.
Windows
Edit environment variables via Control Panel, add or edit OLLAMA_HOST, save, restart Ollama from Start menu.
- Search “environment variables” in Start
- Click “Edit environment variables for your account”
- New variable: name OLLAMA_HOST, value 0.0.0.0:11434
- OK, close dialogs
- Quit Ollama (system tray), restart from Start
Test It
Same machine first:
curl http://localhost:11434
Shows “Ollama is running” when server is active.
From another device, swap localhost for the server IP (ip addr on Linux, ipconfig on Windows, or check router):
curl http://192.168.1.100:11434
Working? Done. Times out? Firewall probably blocked it – most do by default.
List installed models:
curl http://192.168.1.100:11434/api/tags
If the response is JSON with model names, the API is fully accessible.
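If you’d rather script the check than eyeball curl output, a small Python helper works (a sketch using only the standard library; substitute your server’s IP and port):

```python
import urllib.request
import urllib.error

def ollama_reachable(host: str, port: int = 11434, timeout: float = 2.0) -> bool:
    """True if an Ollama server answers on http://host:port/."""
    try:
        with urllib.request.urlopen(f"http://{host}:{port}/", timeout=timeout) as resp:
            # Ollama's root endpoint returns the plain text "Ollama is running".
            return b"Ollama is running" in resp.read()
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    print(ollama_reachable("192.168.1.100"))  # substitute your server's IP
```

A False here means the same thing a curl timeout does: check the firewall first.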
Where It Fails: Edge Cases
Windows: localhost works, IP doesn’t
Set OLLAMA_HOST=0.0.0.0, localhost:11434 responds, but the host IP and 127.0.0.1 don’t. GitHub issue #8304 tracks it: Windows users can only reach Ollama via localhost:11434 despite a 0.0.0.0 binding and a disabled firewall. No fix yet. Workaround: an SSH tunnel, or run Ollama under WSL2 instead of native Windows.
Docker networking quirk
Ollama in Docker, connecting from host or another container? localhost won’t reach it. Use host.docker.internal instead – set base URL to http://host.docker.internal:11434.
The catch: Docker Desktop maps host.docker.internal automatically. Bare Linux Docker doesn’t – add --add-host=host.docker.internal:host-gateway when you start the container.
docker run --add-host=host.docker.internal:host-gateway ...
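If the client container runs under Compose instead of docker run, the equivalent is the extra_hosts key (a sketch; the service name, image, and OLLAMA_BASE_URL variable are placeholders for whatever your client app actually uses):

```yaml
services:
  my-app:                        # hypothetical container talking to Ollama on the host
    image: my-app:latest
    extra_hosts:
      - "host.docker.internal:host-gateway"
    environment:
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
```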
Browser clients hit CORS
Port 11434 exposed. Curl works. Web UI throws CORS errors.
Ollama allows cross-origin requests from 127.0.0.1 and 0.0.0.0 by default; other origins need OLLAMA_ORIGINS. Web UI at http://192.168.1.50:3000? Add this alongside OLLAMA_HOST:
OLLAMA_ORIGINS=http://192.168.1.50:3000
Or allow all (risky on untrusted networks):
OLLAMA_ORIGINS=*
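On a systemd install, both variables live in the same override file from earlier (the origin below is this article’s example web UI address; substitute your own, then daemon-reload and restart):

```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_ORIGINS=http://192.168.1.50:3000"
```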
Security
Most tutorials skip this. Don’t.
Ollama has no native authentication. Zero. Anyone reaching port 11434 can run prompts, list models, consume GPU cycles. Internet-wide scans identified 175,000 exposed Ollama servers as of early 2025. Most unintentional.
Never bind to 0.0.0.0 on a machine with a public IP unless a reverse proxy sits in front. On a LAN behind a router the exposure is limited to local devices, but forwarding port 11434 to the internet directly is never worth the risk. For external access: Nginx with Basic Auth, Cloudflare Tunnel with SSO, or a VPN.
Alternative: Secure Tunneling
Occasional remote access without reconfiguring Ollama or the firewall? Tunneling tools like Ngrok, Pinggy, Cloudflare Tunnel, and Localtonet forward port 11434 through an encrypted connection.
Ngrok example:
ngrok http 11434 --host-header="localhost:11434"
Public HTTPS URL forwarding to local Ollama. Tunnel encrypts traffic. Revoke anytime by stopping the command. Good for demos or one-off sessions.
Debugging: Port Conflict
Ollama won’t start. Logs say “address already in use”. Port 11434 is Ollama’s default communication channel – something else grabbed it.
Find what’s using it:
# macOS/Linux
lsof -i :11434
# Windows
netstat -ano | findstr :11434
Nine times out of ten it’s another Ollama instance you forgot to stop, or a Docker container that didn’t shut down cleanly. Kill the process or move Ollama to a different port:
OLLAMA_HOST=0.0.0.0:11435
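A quick way to check a port before pointing Ollama at it – a minimal sketch using only Python’s standard library:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """True if something already accepts TCP connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 on a successful connection, an errno otherwise.
        return s.connect_ex((host, port)) == 0

if __name__ == "__main__":
    for candidate in (11434, 11435):
        print(candidate, "in use" if port_in_use(candidate) else "free")
```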
FAQ
Can I expose Ollama on my home network safely?
Yes, if your router doesn’t forward port 11434 externally. Binding to 0.0.0.0 behind a router only exposes to LAN devices. Risk: shared networks (coffee shop, office) or UPnP accidentally opening the port. Bind to your specific LAN IP (192.168.1.100:11434) instead of 0.0.0.0 if uncertain.
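Unsure whether the address you’re about to bind to is LAN-only? Python’s ipaddress module can sanity-check it (an illustrative helper, not part of Ollama):

```python
import ipaddress

def looks_public(addr: str) -> bool:
    """True if addr is globally routable -- binding there risks internet exposure."""
    return ipaddress.ip_address(addr).is_global

print(looks_public("192.168.1.100"))  # False: private RFC 1918 address, LAN-only
print(looks_public("8.8.8.8"))        # True: globally routable
```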
Why does curl work but my app can’t connect?
Three culprits: (1) CORS – browser apps need OLLAMA_ORIGINS; (2) Docker – use host.docker.internal, not localhost, if containerized; (3) a firewall blocking LAN traffic. Open the port with sudo ufw allow 11434/tcp (Ubuntu/Debian) or sudo firewall-cmd --add-port=11434/tcp --permanent followed by sudo firewall-cmd --reload (RHEL/CentOS). Test again.
Should I change the default port 11434?
Change it when you hit a conflict or want mild obscurity. Default ports make fingerprinting trivial; changing ports and disabling verbose banners helps. Won’t stop determined attackers, but automated scans skip non-standard ports. OLLAMA_HOST=0.0.0.0:8080 or any free port. Restart Ollama, update client configs. Remember your choice – no config file tracks it.