I'm using Google Cloud Build for CI/CD for my Django app, and one requirement I have is to set my GOOGLE_APPLICATION_CREDENTIALS so I can perform authenticated actions during the Docker build. For example, I need to run RUN python manage.py collectstatic --noinput, which requires access to my Google Cloud Storage buckets.
I've generated the credentials, and it works well when I simply include the key as a .json file in my (currently private) repo: it gets pulled into my Docker container by the COPY . . command, and the env variable is set with ENV GOOGLE_APPLICATION_CREDENTIALS=credentials.json.
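For context, the relevant part of my current Dockerfile is roughly this (the base image and the surrounding steps here are placeholders, not my exact file):

FROM python:3.11-slim
WORKDIR /app
# credentials.json comes in with the rest of the repo
COPY . .
# point the Google client libraries at the key file
ENV GOOGLE_APPLICATION_CREDENTIALS=credentials.json
RUN pip install -r requirements.txt
# needs access to the Cloud Storage buckets, hence the credentials
RUN python manage.py collectstatic --noinput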
Ultimately, I want to grab the credential value from Secret Manager and create the credentials file during the build stage, so I can completely remove the credentials from the repo. I tried doing this by editing cloudbuild.yaml (referencing this doc) with various combinations of the availableSecrets config, the $$SECRET syntax, and --build-arg in the docker build command, then trying to read the value in the Dockerfile with
ARG GOOGLE_BUILD_CREDS
RUN echo "$GOOGLE_BUILD_CREDS" >> credentials.json
ENV GOOGLE_APPLICATION_CREDENTIALS=credentials.json
with no success; the closest wiring I tried is sketched below.
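Roughly, the variant I was attempting (pieced together from that doc; the secret name CREDENTIALS and the build-arg name are from my setup) used a bash entrypoint so the secret could be expanded into the build-arg:

steps:
  - name: gcr.io/cloud-builders/docker
    entrypoint: bash
    args:
      - '-c'
      - >
        docker build --no-cache
        --build-arg GOOGLE_BUILD_CREDS="$$CREDENTIALS"
        -t $_GCR_HOSTNAME/$PROJECT_ID/$REPO_NAME/$_SERVICE_NAME:$COMMIT_SHA
        -f Dockerfile .
    # $$CREDENTIALS escapes Cloud Build substitution so bash expands
    # the value exposed via secretEnv at run time
    secretEnv: ['CREDENTIALS']
    id: Build
availableSecrets:
  secretManager:
    - versionName: projects/PROJECT_ID/secrets/CREDENTIALS/versions/latest
      env: 'CREDENTIALS'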
If someone could advise me how to implement this in my cloudbuild.yaml and Dockerfile, if it's possible, or suggest a better solution altogether, it would be much appreciated.
This is the relevant part of my cloudbuild.yaml:
steps:
  - name: gcr.io/cloud-builders/docker
    args:
      - build
      - '--no-cache'
      - '-t'
      - '$_GCR_HOSTNAME/$PROJECT_ID/$REPO_NAME/$_SERVICE_NAME:$COMMIT_SHA'
      - .
      - '-f'
      - Dockerfile
    id: Build
availableSecrets:
  secretManager:
    - versionName: projects/PROJECT_ID/secrets/CREDENTIALS/versions/latest
      env: 'CREDENTIALS'
2 Answers
I think I've worked out a fix. To solve the error I mentioned in my reply to @guillaume-blaquiere, I updated my build args in cloudbuild.yaml to include --network=cloudbuild, which gives the build access to the correct service account credentials (credit to this answer).
The next issue I faced was with the django-storages library returning this exception. I then came across this suggestion to add the setting GS_QUERYSTRING_AUTH = False to my Django config, and that seems to do the trick. My only concern is that the documentation here does not go into much detail on the impact or risks of disabling this (the bucket is public-read, as it recommends), but it seems to be working as intended, so I will go with this configuration unless a better solution is put forward.
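For anyone hitting the same two issues, the changes are small. In cloudbuild.yaml the build step just gains the extra flag (the other args are unchanged from the question):

    args:
      - build
      - '--no-cache'
      - '--network=cloudbuild'
      - '-t'
      - '$_GCR_HOSTNAME/$PROJECT_ID/$REPO_NAME/$_SERVICE_NAME:$COMMIT_SHA'
      - .
      - '-f'
      - Dockerfile

and the django-storages setting is a single line in the Django settings module:

GS_QUERYSTRING_AUTH = False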
If your container will run on Cloud Run, it's super easy: remove the service account key file (in most use cases you never need it at all).
Keep in mind that a service account key file is a secret containing a private key. If you put it in your container, you are storing it in plain text. So bad for a secret!! (With dive, you can explore your container's content and steal the secret if you have access to the image.)
But I'm sure you know that, because you want to store the secret in a secret manager. Now, a question: how do you access the secret manager? Do you need a service account key file to be authenticated to access it?
In fact, no.
The solution is to use ADC (Application Default Credentials). With the client libraries, use the default credentials mechanism and let the library work out the platform and the credentials to use automatically.
On Cloud Run (as on any other Google Cloud service), a metadata server lets the client libraries get credentials for the runtime service account.
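As a minimal sketch with the Python storage client (the bucket name is a placeholder), relying on ADC simply means creating the client without any key file:

# No key file and no GOOGLE_APPLICATION_CREDENTIALS needed: the client
# falls back to Application Default Credentials (the metadata server on
# Cloud Run, or your gcloud ADC locally).
from google.cloud import storage

client = storage.Client()
blobs = list(client.list_blobs("my-bucket"))  # placeholder bucket name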
On your local environment, you have 2 options:
1. gcloud auth application-default login. These are your own credentials and permissions, not exactly the same as the Cloud Run runtime environment.
2. gcloud auth application-default login --impersonate-service-account=<service account email>. Be sure to have the Service Account Token Creator role on the service account.
Then run your app locally and let ADC pick up the credentials.