
I would like to write a docker-compose file that starts ollama (as in ollama serve) on port 11434 and then creates mymodel from ./Modelfile.

I found a similar question about how to run ollama with docker compose (Run ollama with docker-compose and using gpu), but I could not figure out how to create the model afterwards.

I tried to use the following:

version: '3'

services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_volume:/root/.ollama
    command: ollama create mymodel -f ./Modelfile

volumes:
  ollama_volume:

This fails with unknown command "ollama" for "ollama", so I thought maybe the ollama command line is not installed and I could use curl with their API instead, but curl does not work either.

I saw some people using bash -c "some command", but bash is apparently also not found.

How could I create the model from within the docker-compose? (If it is possible)
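
For context: the "unknown command" errors above are what you get when the image's default entrypoint (ollama itself) has the command: value appended to it, so ollama create ... effectively becomes ollama ollama create .... A minimal, untested sketch of one way around that is to override the entrypoint with a shell (assuming /bin/sh exists in the ollama/ollama image, using a crude 5-second wait for the server to come up, and mounting the Modelfile at /Modelfile, which is just a path chosen here):

version: '3'

services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_volume:/root/.ollama
      # make the Modelfile visible inside the container (path is arbitrary)
      - ./Modelfile:/Modelfile
    # the image's default entrypoint is "ollama", so swap it for a shell
    entrypoint: ["/bin/sh", "-c"]
    # start the server in the background, wait briefly, create the model,
    # then keep the server process in the foreground so the container stays up
    command: ["ollama serve & sleep 5 && ollama create mymodel -f /Modelfile; wait"]

volumes:
  ollama_volume: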

2 Answers


  1. This Docker Compose setup works for me. I am using a shell script as the entrypoint to run commands; you can add your desired commands to that shell file (an adapted sketch for creating mymodel from a Modelfile follows the script below).

    docker-compose file:

    services:
      ollama:
        build:
          context: .
          dockerfile: ./Dockerfile.ollama
        image: ollama
        container_name: ollama
        env_file: env            # your environment file
        entrypoint: /tmp/run_ollama.sh
        ports:
          - 11434:11434
        volumes:
          - .:/app/
          - ./ollama/ollama:/root/.ollama
        pull_policy: always
        tty: true
        restart: always
        networks:
          - net

    networks:
      net:
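
    Once the Dockerfile and shell script below are in place, the whole thing can be built and started with, for example:

    docker compose up --build -d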
    

    Dockerfile for ollama:

    # build on top of the official Ollama image
    FROM ollama/ollama
    
    # copy in the startup script and make it executable
    COPY ./run_ollama.sh /tmp/run_ollama.sh
    
    WORKDIR /tmp
    
    RUN chmod +x run_ollama.sh
    
    EXPOSE 11434
    

    Shell file for running ollama commands:

    #!/bin/bash
    
    echo "Starting Ollama server..."
    ollama serve &
    
    echo "Waiting for Ollama server to be active..."
    # the header row of `ollama list` only shows up once the server answers
    while [ "$(ollama list | grep 'NAME')" == "" ]; do
      sleep 1
    done
    
    # now that the server is up, pull/run the model
    ollama run llama3
    
    # keep the background server process in the foreground so the container stays alive
    wait
    

    Hope this helps.
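
    For the original question specifically (creating mymodel from ./Modelfile), the same script could be adapted roughly as follows. This is only a sketch, assuming the Modelfile sits in the project directory that the compose file above mounts at /app:

    #!/bin/bash
    
    echo "Starting Ollama server..."
    ollama serve &
    
    echo "Waiting for Ollama server to be active..."
    while [ "$(ollama list | grep 'NAME')" == "" ]; do
      sleep 1
    done
    
    # build the custom model from the mounted Modelfile
    ollama create mymodel -f /app/Modelfile
    
    wait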

  2. Change the host port to 11435 and re-run it; it should work (this helps when port 11434 is already in use on the host).

    version: '3.8'
    
    services:
      ollama:
        image: ollama/ollama:latest
        container_name: ollama
        ports: ["11435:11434"] # host port changed to 11435
        volumes:
          - ollama:/root/.ollama
        pull_policy: always
        tty: true
        restart: unless-stopped
    
    volumes:
      ollama:
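
    Once the container is up, a quick way to confirm the server answers on the remapped host port (a sketch; /api/tags lists the models available locally):

    curl http://localhost:11435/api/tags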
    

    Here I have an ollama and ollama webui docker-compose.yml: https://github.com/jinnabaalu/infinite-docker-compose/blob/main/ollama/docker-compose.yml

    I have also made a video about this on https://www.youtube.com/
