
We have a Python project that requires a Postgres Docker image. The application runs fine on the local system after starting the Docker container.
We have added two GitLab jobs: one starts the Docker container and the other runs the Python script, but the second job, which depends on the first, does not execute successfully. Is there a reason the second job cannot access the Docker container?

.gitlab-ci.yml

stages:
  - build
  - test

build:
  image: docker/compose:latest
  services:
    - docker:dind

  script:
    - docker-compose down -v
    - docker-compose up -d

test:
  image: python:3.8
  needs: [build]
  variables:
    POETRY_VERSION: "1.1.15"
    POETRY_CORE_VERSION: "1.0.8"

  script:
    - python --version
    - export POETRY_VIRTUALENVS_IN_PROJECT=true
    - pip install poetry==${POETRY_VERSION} poetry-core==${POETRY_CORE_VERSION}
    - poetry install --no-interaction --no-ansi
    - poetry run pytest


Answers


  1. If you need the Docker Compose project running in the test job, add it to the test job and bring it up there in the script.

    A recent docker:dind service should include the Docker Compose plugin [1] (note: without the dash, so "docker compose", not "docker-compose").
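    If you are not sure which flavor a given image provides, a quick (hypothetical) check step can be added to the job script; the v2 plugin is a `docker` subcommand, while v1 was a standalone binary:

```yaml
test:
  script:
    # prints the Compose version; falls back to the legacy v1 binary
    # if the v2 plugin is not installed
    - docker compose version || docker-compose version
```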

    ---
    
    stages:
      - build # this may be required due to compose project requirements, so I kept it in the example
      - test
    
    build: # this may be required due to compose project requirements, so I kept it in the example
      # ...
    
    test:
      image: python:3.8
      services:
        - docker:dind
    
      needs: [build] # this may be required due to compose project requirements, so I kept it in the example
      variables:
        POETRY_VERSION: "1.1.15"
        POETRY_CORE_VERSION: "1.0.8"
    
      script:
        # the OP's `--v` appears to be a typo for `-v` (remove volumes);
        # `|| true` tolerates the first run, when nothing is up yet
        - docker compose down -v || true
        - docker compose up -d
        - python --version
        - export POETRY_VIRTUALENVS_IN_PROJECT=true
        - pip install poetry==${POETRY_VERSION} poetry-core==${POETRY_CORE_VERSION}
        - poetry install --no-interaction --no-ansi
        - poetry run pytest
    
    # You may also want to use `after_script:` to down the
    # project so it does not keep running after the test job's
    # script has finished.

      after_script:
        - docker compose down -v || true
    

    And further, for the TL;DR:

    Why are the GitLab jobs not sharing the Docker container?

    Because they are different jobs: each job runs in its own fresh environment with its own `docker:dind` service, so containers started in the build job no longer exist when the test job runs.
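    If all the compose project provides is Postgres (an assumption about this project), a further option is to skip docker-in-docker entirely and declare the database as a GitLab service on the test job. The service tag, credentials, and `DATABASE_URL` below are placeholders:

```yaml
test:
  image: python:3.8
  services:
    - postgres:13  # placeholder tag; match the version the compose file uses
  variables:
    POSTGRES_USER: runner
    POSTGRES_PASSWORD: runner
    POSTGRES_DB: app_test
    # the service is reachable under the hostname "postgres"
    DATABASE_URL: "postgresql://runner:runner@postgres:5432/app_test"
  script:
    - pip install poetry==${POETRY_VERSION} poetry-core==${POETRY_CORE_VERSION}
    - poetry install --no-interaction --no-ansi
    - poetry run pytest
```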


    [1] https://hub.docker.com/r/docker/compose
  2. Solution 1

    To run Compose in the job and still execute the script against the Python base image, you could do the following:

    1. Put the commands in a shell script, e.g. script.sh:
    #!/bin/bash
    set -e  # fail the job if any command fails
    python --version
    export POETRY_VIRTUALENVS_IN_PROJECT=true
    pip install poetry==${POETRY_VERSION} poetry-core==${POETRY_CORE_VERSION}
    poetry install --no-interaction --no-ansi
    poetry run pytest
    
    2. Change the job as follows:
    test:
      services:
        - docker:dind
      image: docker/compose:latest
      # needs: [build]
      variables:
        POETRY_VERSION: "1.1.15"
        POETRY_CORE_VERSION: "1.0.8"
    
      script:
        - docker-compose up -d
        - docker run --network=host -e POETRY_VERSION -e POETRY_CORE_VERSION -v $PWD/script.sh:/script.sh python:3.8 bash /script.sh
    

    Solution 2

    Install Python in the docker/compose base image and run the script there.
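    A sketch of that approach, assuming the docker/compose image is Alpine-based (so `apk` is its package manager); the Python tooling is installed into the job container itself before running Compose and the tests:

```yaml
test:
  image: docker/compose:latest
  services:
    - docker:dind
  variables:
    POETRY_VERSION: "1.1.15"
    POETRY_CORE_VERSION: "1.0.8"
  script:
    # assumption: Alpine base image, hence apk
    - apk add --no-cache python3 py3-pip
    - docker-compose up -d
    - pip3 install poetry==${POETRY_VERSION} poetry-core==${POETRY_CORE_VERSION}
    - poetry install --no-interaction --no-ansi
    - poetry run pytest
```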
