I am trying to pass environment variables to my Node.js Docker image when running the container, as shown in the pipeline below:
stages:
  - publish
  - deploy

variables:
  TAG_LATEST: $CI_REGISTRY_IMAGE/$CI_COMMIT_REF_NAME:latest
  TAG_COMMIT: $CI_REGISTRY_IMAGE/$CI_COMMIT_REF_NAME:$CI_COMMIT_SHORT_SHA

publish:
  image: docker:latest
  stage: publish
  services:
    - docker:dind
  script:
    - touch env.txt
    - docker build -t $TAG_COMMIT -t $TAG_LATEST .
    - docker login -u gitlab-ci-token -p $CI_BUILD_TOKEN $CI_REGISTRY
    - docker push $TAG_COMMIT
    - docker push $TAG_LATEST

deploy:
  image: alpine:latest
  stage: deploy
  tags:
    - deployment
  script:
    - chmod og= $ID_RSA
    - apk update && apk add openssh-client
    - echo "AWS_ACCESS_KEY_ID"=$AWS_ACCESS_KEY_ID >> "env.txt"
    - echo "AWS_S3_BUCKET"=$AWS_S3_BUCKET >> "env.txt"
    - echo "AWS_S3_REGION"=$AWS_S3_REGION >> "env.txt"
    - echo "AWS_SECRET_ACCESS_KEY"=$AWS_SECRET_ACCESS_KEY >> "env.txt"
    - echo "DB_URL"=$DB_URL >> "env.txt"
    - echo "JWT_EXPIRES_IN"=$JWT_EXPIRES_IN >> "env.txt"
    - echo "OTP_EXPIRE_TIME_SECONDS"=$OTP_EXPIRE_TIME_SECONDS >> "env.txt"
    - echo "TWILIO_ACCOUNT_SID"=$TWILIO_ACCOUNT_SID >> "env.txt"
    - echo "TWILIO_AUTH_TOKEN"=$TWILIO_AUTH_TOKEN >> "env.txt"
    - echo "TWILLIO_SENDER"=$TWILLIO_SENDER >> "env.txt"
    - ssh -i $ID_RSA -o StrictHostKeyChecking=no $SERVER_USER@$SERVER_IP "docker login -u gitlab-ci-token -p $CI_BUILD_TOKEN $CI_REGISTRY"
    - ssh -i $ID_RSA -o StrictHostKeyChecking=no $SERVER_USER@$SERVER_IP "docker pull $TAG_COMMIT"
    - ssh -i $ID_RSA -o StrictHostKeyChecking=no $SERVER_USER@$SERVER_IP "docker container rm -f my-app || true"
    - ssh -i $ID_RSA -o StrictHostKeyChecking=no $SERVER_USER@$SERVER_IP "docker run --env-file env.txt -d -p 8080:8080 --name my-app $TAG_COMMIT"
  environment:
    name: development
    url: 90900
  only:
    - master
I am running the command docker run --env-file env.txt, but it fails with the error: docker: open env.txt: no such file or directory.
How can I solve this issue and pass multiple variables to my docker run command?
2 Answers
Which job is failing? In your deploy job, you are creating the env.txt locally and using SSH to run the Docker commands, but you never scp your local env.txt to $SERVER_USER@$SERVER_IP for the remote process to pick it up.
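Something along these lines should work in the deploy script, after the echo lines that build env.txt (a sketch; the ~/env.txt path on the server is an assumption):

    # copy the env file to the deployment server before starting the container
    - scp -i $ID_RSA -o StrictHostKeyChecking=no env.txt $SERVER_USER@$SERVER_IP:~/env.txt
    # then reference the file by its path on the server, not on the runner
    - ssh -i $ID_RSA -o StrictHostKeyChecking=no $SERVER_USER@$SERVER_IP "docker run --env-file ~/env.txt -d -p 8080:8080 --name my-app $TAG_COMMIT"

scp is typically included in the openssh-client package the job already installs.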
I had the same issue using GitLab CI/CD, i.e. trying to inject env vars that were referenced in the project's .env file, via the runner (Docker executor), into the output Docker container.
We don't want to commit any sensitive info into git, so one option is to save the variables in a file on the server and include it via the --env-file flag. But GitLab Runner creates a new container for every run, so this is not possible: the host running the YAML script is ephemeral and is not the actual server that GitLab Runner was installed on.
The suggestion by @dmoonfire to scp the file over sounded like a good solution, but I couldn't get it to work for copying a file from outside into the GitLab runner. I'd need to copy the public key from the executor to the GitLab runner server, but the Docker executor is ephemeral.
I found the simplest solution was to use the GitLab CI/CD variable settings. It's possible to mask variables and restrict them to protected branches or protected tags, etc. These get injected into the job's container so that your .env file can access them.
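For instance, one way to apply that in this pipeline is to skip the env file entirely and pass the CI/CD variables straight through on the remote docker run (a sketch; only a few of the variables are shown, the rest follow the same pattern, and values containing spaces or quotes would need extra quoting):

    # the variables are expanded by the job shell before the command is sent over SSH
    - ssh -i $ID_RSA -o StrictHostKeyChecking=no $SERVER_USER@$SERVER_IP "docker run -d -p 8080:8080 --name my-app -e DB_URL=$DB_URL -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY $TAG_COMMIT"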