How to fix Connection Issue or Update Needed on Ollama


I'm getting an "Ollama Version: Not Detected" message and an "Open WebUI: Server Connection Error" after installing Open WebUI on Ubuntu with:

sudo docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=http://host.docker.internal:11434/api --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

Under Settings > General, the Ollama API URL is set to: http://localhost:11434/api

Running ollama serve gives me:

Error: listen tcp 127.0.0.1:11434: bind: address already in use

Going to http://127.0.0.1:11434/ I get presented with: Ollama is running
Going to http://localhost:11434/api gives me: 404 not found
But going to http://localhost:11434/api/version gives me: version: "0.1.25"
I also did sudo ufw disable to make sure it wasn't a firewall issue

I've tried this: https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux
And doing sudo systemctl edit ollama.service gives me a file, like this:

### Editing /etc/systemd/system/ollama.service.d/override.conf
### Anything between here and the comment below will become the new contents of the file

[Unit]
Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_ORIGINS=*"

### Lines below this comment will be discarded

### /etc/systemd/system/ollama.service
# [Unit]
# Description=Ollama Service
# After=network-online.target
# 
# [Service]
# ExecStart=/usr/local/bin/ollama serve
# User=ollama
# Group=ollama
# Restart=always
# RestartSec=3
# Environment="PATH=/home/myuser/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/snap/bin"
# 
# [Install]
# WantedBy=default.target
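
Note: systemd honors Environment= directives only inside a [Service] section, so an override placed under [Unit] as above is silently ignored. Presumably the override should instead read (followed by sudo systemctl daemon-reload and sudo systemctl restart ollama):

```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_ORIGINS=*"
```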

I've also tried:

sudo docker stop open-webui
sudo docker rm -f open-webui
#variations of the install command

If I use curl:

From a different device on the same network, curl http://192.168.y.x:11434/api gives:
curl: (7) Failed to connect to 192.168.y.x port 11434: Connection refused

From the same device, curl http://192.168.y.x:11434/api gives:
curl: (7) Failed to connect to 192.168.y.x port 11434 after 0 ms: Connection refused

From the same device, curl http://127.0.0.1:11434/api gives:
404 page not found

Curl inside the Docker container (docker exec -it open-webui sh):

curl http://192.168.y.x:11434/api -> curl: (7) Failed to connect to 192.168.y.x port 11434 after 0 ms: Couldn't connect to server
curl http://127.0.0.1:11434/api -> curl: (7) Failed to connect to 127.0.0.1 port 11434 after 0 ms: Couldn't connect to server
curl http://host.docker.internal/api (no port given, so this request goes to port 80, where the host's Apache server answers) ->

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>404 Not Found</title>
</head><body>
<h1>Not Found</h1>
<p>The requested URL was not found on this server.</p>
<hr>
<address>Apache/2.4.52 (Ubuntu) Server at host.docker.internal Port 80</address>
</body></html>

Can someone tell me what I have to do to fix this?

Accepted answer:

Apologies if I have got the wrong end of the stick. I don't know much about this.

I gather that you are running Ollama on your host machine and trying to access it on port 11434 at host.docker.internal, which I believe is primarily a Docker Desktop feature (on plain Linux Docker it only resolves when you pass --add-host=host.docker.internal:host-gateway, as your run command does). I don't use Docker Desktop. However, if the goal is just to get this working, you can start by using Docker's host network.

docker run --rm \
  -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api \
  --net=host \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main

With --net=host the container shares the host's network stack, so the UI process can reach Ollama directly at 127.0.0.1:11434 on the host; no --add-host mapping is needed, and any -p port mappings are ignored, which is why the UI comes up on its native port 8080. Presumably this is what you were trying to do via host.docker.internal in Docker Desktop.

Then visit http://127.0.0.1:8080/.
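
If the UI still cannot reach Ollama, it is worth first confirming that something is actually listening on port 11434 at all. A small sketch of such a check, using bash's /dev/tcp pseudo-device and the coreutils timeout command (the port numbers are just examples):

```shell
#!/usr/bin/env bash
# Return 0 if something accepts TCP connections on host:port, nonzero otherwise.
# Uses bash's /dev/tcp feature; `timeout` is from GNU coreutils.
port_open() {
  local host="$1" port="$2"
  timeout 1 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null
}

if port_open 127.0.0.1 11434; then
  echo "something is listening on 11434"
else
  echo "nothing is listening on 11434"
fi
```

If nothing is listening, the problem is on the Ollama side (service not running, or bound to a different address) rather than in the WebUI container.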

Perhaps using Docker Compose would work well, with Ollama running in one service and the UI in another?
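
A minimal sketch of such a compose file (the service names, volume names, and host port here are assumptions, not a tested configuration; inside the compose network the UI reaches Ollama by its service name):

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_API_BASE_URL=http://ollama:11434/api
    ports:
      - "3000:8080"
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama

volumes:
  ollama:
  open-webui:
```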


Adding in more options from the original question:

docker run \
  -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api \
  --net=host \
  --name open-webui \
  -v open-webui:/app/backend/data \
  --restart always \
  ghcr.io/open-webui/open-webui:main

The --rm option has been removed since it is mutually exclusive with --restart always.

To install and start Ollama on Linux you'd do something like:

curl -fsSL https://ollama.com/install.sh | sh

It downloads and installs Ollama, then starts the server; on most Linux systems it registers Ollama as a systemd service, which is why the port is already in use when you later run ollama serve by hand.