I would like to make a docker-compose file that starts Ollama (as in ollama serve) on port 11434 and then creates mymodel from ./Modelfile.
I found a similar question about running Ollama with Docker Compose (Run ollama with docker-compose and using gpu), but I could not figure out how to create the model afterwards.
I tried the following:
version: '3'
services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_volume:/root/.ollama
    command: ollama create mymodel -f ./Modelfile
volumes:
  ollama_volume:
This fails with unknown command "ollama" for "ollama", so I thought the ollama command-line tool might not be installed and tried curl against their API instead, but curl does not work either. I saw some people using bash -c "some command", but bash is apparently also not found.
How can I create the model from within docker-compose (if that is possible)?
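(For reference, one workaround that needs no image changes is to create the model from the host after the server is up. This is a sketch; it assumes the container is named ollama, as in the compose file above, and that the Modelfile has been mounted into the container, e.g. by adding - ./Modelfile:/Modelfile under the service's volumes:.)

```shell
# start the stack, then create the model from the host once the server is running
docker compose up -d
docker exec ollama ollama create mymodel -f /Modelfile
```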
2 Answers
This docker compose works for me. The error in your attempt happens because the image's entrypoint is /bin/ollama, so whatever you put in command: is appended to it (effectively running ollama ollama create ...). I am using a shell entrypoint to run commands; you can add your desired commands in the shell file.
docker-compose file:
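Something along these lines (a sketch; the service name, volume name, and build context are assumptions):

```yaml
# docker-compose.yml - build a custom image that wraps ollama with a shell entrypoint
version: '3'
services:
  ollama:
    build: .                      # uses the Dockerfile in this directory
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_volume:/root/.ollama
volumes:
  ollama_volume:
```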
Dockerfile for ollama:
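Roughly like this (the file names Modelfile and run-ollama.sh are assumptions):

```dockerfile
# Dockerfile - extend the official image and replace its entrypoint with a script
FROM ollama/ollama:latest
COPY ./Modelfile /Modelfile
COPY ./run-ollama.sh /run-ollama.sh
RUN chmod +x /run-ollama.sh
ENTRYPOINT ["/bin/sh", "/run-ollama.sh"]
```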
Shell file for running ollama commands:
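A minimal version of the script (a sketch; the fixed sleep is a crude way to wait for the server, and the model name mymodel matches the question):

```shell
#!/bin/sh
# run-ollama.sh - start the server, wait for it, then create the model
/bin/ollama serve &           # start the API server in the background
pid=$!
sleep 5                       # give the server time to come up (could poll the API instead)
ollama create mymodel -f /Modelfile
wait $pid                     # keep the container alive on the server process
```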
Hope this helps.
Change the host port to 11435 and re-run it; it should work.
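That is, in the compose file's port mapping (a fragment of the service definition from the question):

```yaml
ports:
  - "11435:11434"   # host port 11435 -> container port 11434
```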
Here I have Ollama and an Ollama web UI.
docker-compose.yml: https://github.com/jinnabaalu/infinite-docker-compose/blob/main/ollama/docker-compose.yml
I have also created a video on https://www.youtube.com/
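A sketch of what such a compose file can look like (the web UI image, ports, and environment variable here are assumptions; see the linked repository for the actual file):

```yaml
# Ollama plus a web UI talking to it over the compose network
version: '3'
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama_volume:/root/.ollama
  ollama-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
volumes:
  ollama_volume:
```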