I have a project with several branches: the main branch, the develop branch, and then n dynamic feature branches.
For example, the docker-build job in my GitLab CI pipeline fetches some environment variables in the before_script section from a file called either .main.env or .develop.env (depending on the branch name) using the source command.
docker-build:
  image: docker:latest
  stage: package
  tags:
    - deployment
  before_script:
    - source .${CI_COMMIT_REF_NAME}.env || source .develop.env
  services:
    - docker:dind
  script:
    - docker build --build-arg SPRING_ACTIVE_PROFILE=$SPRING_ACTIVE_PROFILE -t $TAG_COMMIT -t $TAG_LATEST . -f Dockerfile
    - docker login -u gitlab-ci-token -p $CI_BUILD_TOKEN $CI_REGISTRY
    - docker push $TAG_COMMIT
    - docker push $TAG_LATEST
As you can see, I tried adding the conditional command source .develop.env in case the file in the first command wasn't found, but this does not work. It still throws an error saying that the file .SOME_FEATURE_BRANCH.env couldn't be found.
Now I'm wondering if there is a better way to solve this than adding another job just for feature branches that simply sources the .develop.env file?
2 Answers
The error is written to STDERR when the first source runs and fails; execution then continues on to the second source. If you want to suppress the error message from the first source command, you can redirect its STDERR to /dev/null.
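A minimal sketch of that before_script line, assuming the same file-name convention as in the question:

before_script:
  # Discard the "file not found" message from the first source; fall back to .develop.env
  - source .${CI_COMMIT_REF_NAME}.env 2>/dev/null || source .develop.env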
Probably better to just be explicit and check for the existence of the file.
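For example, an explicit existence check in the before_script could look something like this (a sketch using a POSIX test, assuming the same file names as above):

before_script:
  - |
    # Use the branch-specific env file if it exists, otherwise fall back to .develop.env
    if [ -f ".${CI_COMMIT_REF_NAME}.env" ]; then
      source ".${CI_COMMIT_REF_NAME}.env"
    else
      source .develop.env
    fi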
You could also accomplish something similar with rules:variables:, which can be handy if you want to decouple your shell scripts from being concerned about details of the CI environment.
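A sketch of that approach (assuming a GitLab version that supports rules:variables:, 13.7 or later; the ENV_FILE variable name is only chosen here for illustration):

docker-build:
  rules:
    # Pick the env file per branch; feature branches fall back to .develop.env
    - if: '$CI_COMMIT_REF_NAME == "main"'
      variables:
        ENV_FILE: .main.env
    - if: '$CI_COMMIT_REF_NAME == "develop"'
      variables:
        ENV_FILE: .develop.env
    - when: on_success
      variables:
        ENV_FILE: .develop.env
  before_script:
    - source ${ENV_FILE}

This way the shell part only references ENV_FILE, and the branch logic lives entirely in the job's rules.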