
I would like to pass my Google Cloud Platform service account's JSON credentials file to a Docker container so that the container can access a Cloud Storage bucket. So far I have tried to pass the file path as an environment variable on the docker run command, like this:

  • Using the --env flag: docker run -p 8501:8501 --env GOOGLE_APPLICATION_CREDENTIALS=/Users/gcp_credentials.json -t -i image_name
  • Using the -e flag, and even exporting the same environment variable in the command line first: docker run -p 8501:8501 -e GOOGLE_APPLICATION_CREDENTIALS=/Users/gcp_credentials.json -t -i image_name

But nothing worked, and I always get the following error when running the docker container:

W external/org_tensorflow/tensorflow/core/platform/cloud/google_auth_provider.cc:184]
All attempts to get a Google authentication bearer token failed, returning an empty token.
Retrieving token from files failed with "Not found: Could not locate the credentials file.".

How can I pass the Google credentials file to a container running locally on my personal laptop?

3 Answers


  1. You cannot "pass" an external (host) path like that; you have to get the JSON file itself into the container (see the sketch below).
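
    A minimal sketch of one way to do that, assuming your Dockerfile and the key file sit in the same build context (the paths here are placeholders, not anything from the original question): copy the key into the image at build time and point GOOGLE_APPLICATION_CREDENTIALS at the in-container path.

      # Dockerfile: copy the key into the image and point the env var at it.
      # Note: the key is then baked into the image, so only do this for local tests.
      COPY gcp_credentials.json /app/gcp_credentials.json
      ENV GOOGLE_APPLICATION_CREDENTIALS=/app/gcp_credentials.json

    Mounting the file at run time instead (see the answers below) keeps the key out of the image.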

  2. Two ways to do it:

    Secrets: these work with Docker swarm mode.

    • create a docker secret
    • use the secret with a container using --secret

    The advantage is that secrets are encrypted; they are decrypted only when mounted into containers.
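
    A minimal sketch of that flow, assuming swarm mode and reusing the file path, port, and image name from the question (the secret and service names gcp_key and my_service are just illustrative):

      # store the key file as a swarm secret (one-time setup)
      docker swarm init
      docker secret create gcp_key /Users/gcp_credentials.json

      # run as a service; the secret is mounted at /run/secrets/gcp_key
      docker service create \
        --name my_service \
        --publish 8501:8501 \
        --secret gcp_key \
        --env GOOGLE_APPLICATION_CREDENTIALS=/run/secrets/gcp_key \
        image_name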

  3. I log into gcloud in my local environment, then share that JSON file as a volume mounted at the same location inside the container.

    Here is a great post on how to do it, with the relevant extract below: Use Google Cloud user credentials when testing containers locally

    Login locally

    To get your default user credentials in your local environment, you
    have to use the gcloud SDK. You have 2 commands to get authenticated:

    • gcloud auth login: get authenticated on all subsequent gcloud commands
    • gcloud auth application-default login: create your ADC locally, in a “well-known” location

    Note location of credentials

    The Google auth library tries to get valid credentials by performing
    checks in this order:

    1. Look at the GOOGLE_APPLICATION_CREDENTIALS environment variable value. If it exists, use it, else…
    2. Look at the metadata server (only on Google Cloud Platform). If it returns the correct HTTP codes, use it, else…
    3. Look at the “well-known” location to see if a user credentials JSON file exists.

    The “well-known” locations are:

    • On Linux: ~/.config/gcloud/application_default_credentials.json
    • On Windows: %appdata%/gcloud/application_default_credentials.json

    Share volume with container

    Therefore, you have to run your local docker run command like this:

      ADC=~/.config/gcloud/application_default_credentials.json
      docker run \
        -e GOOGLE_APPLICATION_CREDENTIALS=/tmp/keys/FILE_NAME.json \
        -v ${ADC}:/tmp/keys/FILE_NAME.json:ro \
        <IMAGE_URL>

    NB: this is only for local development; on Google Cloud Platform, the credentials for the service are provided automatically for you.
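
    Applied to the service account file from the question, the same volume-mount pattern might look like this (assuming the key really lives at /Users/gcp_credentials.json on the host; /tmp/keys/ is just an arbitrary in-container path):

      docker run -p 8501:8501 \
        -e GOOGLE_APPLICATION_CREDENTIALS=/tmp/keys/gcp_credentials.json \
        -v /Users/gcp_credentials.json:/tmp/keys/gcp_credentials.json:ro \
        -t -i image_name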
