I want to run the Ollama Docker image, which exposes the entire API with all its endpoints.
I want to secure it by adding a reverse proxy on top of it, and I have chosen Nginx for the job.
So I need to add Nginx to my Docker image with Ollama.
I have constructed the following Dockerfile:
```dockerfile
FROM ollama/ollama:latest
FROM nginx:latest

COPY /.nginx/nginx.conf /etc/nginx/conf.d/default.conf
```
My Nginx configuration:
```nginx
server {
    listen 80;

    location /api/tags {
        proxy_pass http://localhost:11434/api/tags;
        proxy_set_header Host localhost:11434;
    }

    # Default: Return 404 for all other endpoints
    location / {
        return 404;
    }
}
```
Here I allow just one endpoint to be proxied to Ollama.
To test it, I tried some endpoints and they correctly returned 404. But when I tried the /api/tags path, I got a 502 "Bad Gateway" error.
In the Docker logs I could see:
```
2024-12-01 22:44:41 2024/12/01 21:44:41 [warn] 29#29: *1 upstream server temporarily disabled while connecting to upstream, client: 172.17.0.1, server: , request: "GET /api/tags HTTP/1.1", upstream: "http://127.0.0.1:11434/api/tags", host: "localhost:8000"
2024-12-01 22:44:41 2024/12/01 21:44:41 [error] 29#29: *1 connect() failed (111: Connection refused) while connecting to upstream, client: 172.17.0.1, server: , request: "GET /api/tags HTTP/1.1", upstream: "http://127.0.0.1:11434/api/tags", host: "localhost:8000"
2024-12-01 22:44:41 2024/12/01 21:44:41 [warn] 29#29: *1 upstream server temporarily disabled while connecting to upstream, client: 172.17.0.1, server: , request: "GET /api/tags HTTP/1.1", upstream: "http://127.0.0.1:11434/api/tags", host: "localhost:8000"
2024-12-01 22:44:41 172.17.0.1 - - [01/Dec/2024:21:44:41 +0000] "GET /api/tags HTTP/1.1" 502 559 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36 OPR/114.0.0.0" "-"
```
It seems that the Ollama server is not running, but I don't know:
- How to check running processes in Docker (the `ps` command is not found in the container),
- How to run the Ollama server,
- How to make Docker start the Ollama server automatically.
2 Answers
I think the main problem is that the Dockerfile and the current configuration don't allow both services (Ollama and Nginx) to start simultaneously.
Why?
(1) Incomplete Dockerfile
The original Dockerfile only contains the two FROM instructions and a COPY. It does not specify how to launch the services or how they will interact.
(2) The logs are clear about this
In the provided logs, clear connection errors are visible: `connect() failed (111: Connection refused) while connecting to upstream`. This error indicates that Nginx cannot connect to Ollama, suggesting that the Ollama service is not started or not accessible.
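To confirm this diagnosis, the processes inside the container can be inspected from the host with Docker's own tooling, even though `ps` is not installed in the image (the container name here is illustrative):

```shell
# List the processes running inside a container; no ps needed in the image
docker top my-ollama-container

# Alternatively, list the PID entries in the proc filesystem from inside
docker exec my-ollama-container ls /proc
```

With the Dockerfile above, this would show only Nginx running, since the image built from it contains no Ollama at all.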
You can’t create a Docker image containing multiple programs the way you’re trying. The Dockerfile you have will result in an image that only contains Nginx and not Ollama. It’s also generally recommended to only have one process running in a container.
I'd recommend running Ollama and Nginx each in their own container and orchestrating them with Docker Compose.
Create a `docker-compose.yml` file containing both services. Then change your `nginx.conf` file so it passes the request to the Ollama container by using the `ollama` service name instead of `localhost`. Your Dockerfile will work as is, since the first `FROM` line is effectively ignored, but to keep things tidy you should reduce it to just the Nginx image and the `COPY` instruction.
Running `docker compose up -d` will start both containers. When a request comes in to Nginx on `http://localhost/api/tags`, it will be passed on to the Ollama container at `/api/tags` on port 11434.
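A minimal sketch of that setup follows. The host port mapping, the volume name, and the build context are assumptions for illustration, not part of the original answer:

```yaml
# docker-compose.yml (sketch; ports and volume name are illustrative)
services:
  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama-models:/root/.ollama   # persist downloaded models

  nginx:
    build: .                          # builds the trimmed Dockerfile
    ports:
      - "8000:80"                     # host port 8000 -> Nginx listening on 80
    depends_on:
      - ollama

volumes:
  ollama-models:
```

The trimmed Dockerfile keeps only the Nginx image, and in `nginx.conf` the `proxy_pass` target becomes the `ollama` service name:

```dockerfile
FROM nginx:latest
COPY /.nginx/nginx.conf /etc/nginx/conf.d/default.conf
```

```nginx
location /api/tags {
    proxy_pass http://ollama:11434/api/tags;
    proxy_set_header Host ollama:11434;
}
```

Compose puts both services on a shared network where the service name resolves to the Ollama container, which is why `localhost` must be replaced.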