We have a Python project which requires a Postgres Docker image. The application runs fine on the local system after starting the Docker container.
We have added two GitLab jobs: one starts the Docker container and the other runs the Python script, but the second job, which depends on the first, does not execute successfully. Is there any reason the second job would not have access to the Docker container started in the first job?
gitlab-ci.yml
stages:
  - build
  - test

build:
  image: docker/compose:latest
  services:
    - docker:dind
  script:
    - docker-compose down -v
    - docker-compose up -d

test:
  image: python:3.8
  needs: [build]
  variables:
    POETRY_VERSION: "1.1.15"
    POETRY_CORE_VERSION: "1.0.8"
  script:
    - python --version
    - POETRY_VIRTUALENVS_IN_PROJECT=true
    - pip install poetry==${POETRY_VERSION} poetry-core==${POETRY_CORE_VERSION}
    - poetry install --no-interaction --no-ansi
    - poetry run pytest
Answers
If you need the Docker Compose project running in the test stage, add it to the test stage and bring it up there in the script.
A recent docker:dind service should include the Docker Compose plugin (note that it is invoked without the dash, i.e. "docker compose" rather than "docker-compose"). And further, the TL;DR for why your current setup fails: because they are different jobs. Each GitLab CI job runs in its own isolated environment, so containers started in the build job are gone by the time the test job starts.
Solution 1

To run compose and the script with the Python base image, you could move everything into a single job and call a helper script.sh from it:
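Below is a minimal sketch of that approach, assuming a recent docker:dind service; the DOCKER_HOST/DOCKER_TLS_CERTDIR variables and the pip-installed docker-compose client are assumptions that may need adjusting for your runner configuration.

.gitlab-ci.yml

stages:
  - test

test:
  image: python:3.8
  services:
    - docker:dind
  variables:
    POETRY_VERSION: "1.1.15"
    POETRY_CORE_VERSION: "1.0.8"
    # point the compose client at the dind service (assumes a non-TLS daemon)
    DOCKER_HOST: tcp://docker:2375
    DOCKER_TLS_CERTDIR: ""
  script:
    - chmod +x script.sh
    - ./script.sh

script.sh:

#!/bin/sh
# bring the compose project up and run the tests inside the same job
set -e

# docker-compose installed via pip is an assumption; the docker CLI with the
# compose plugin would work as well
pip install docker-compose

docker-compose down -v
docker-compose up -d

export POETRY_VIRTUALENVS_IN_PROJECT=true
pip install poetry==${POETRY_VERSION} poetry-core==${POETRY_CORE_VERSION}
poetry install --no-interaction --no-ansi
poetry run pytest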
Solution 2
Install Python into the docker/compose base image and run the script there.
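A sketch of this variant, assuming the docker/compose image is Alpine-based so Python can be added with apk; the package names and the DOCKER_* variables are assumptions, and the Python version will be whatever the Alpine repositories provide rather than 3.8.

.gitlab-ci.yml

stages:
  - test

test:
  image: docker/compose:latest
  services:
    - docker:dind
  variables:
    DOCKER_HOST: tcp://docker:2375
    DOCKER_TLS_CERTDIR: ""
  script:
    # bring the compose project up in the same job that runs the tests
    - docker-compose down -v
    - docker-compose up -d
    # add Python and the test tooling to the compose image
    - apk add --no-cache python3 py3-pip
    - pip3 install poetry==1.1.15 poetry-core==1.0.8
    - poetry install --no-interaction --no-ansi
    - poetry run pytest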