I have a Docker image that runs some scripts, and at the end of the script I use the bq command to append data to a BigQuery table. This all works fine when I don't run it within Docker, since my local dev environment has the proper authentication set up.
The issue is how to set up authentication so that I can run bq from within Docker when I do docker run -t my-image.
Also, after I upload it to the gcr.io registry and run it as a job in Cloud Run, how do I authenticate?
Looking up the authentication documentation (e.g. https://cloud.google.com/docs/authentication/provide-credentials-adc) is like a maze, and googling doesn't turn up very useful results.
2 Answers
On GCP Cloud Run it's quite simple: in the settings page for the Job you can select a Service Account to run the image as. As long as that service account has the right access to BigQuery, it should work.
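The CLI equivalent looks roughly like this (the project, job name, region and role are placeholders/examples, adjust to your setup):

    # Give the service account access to BigQuery (roles/bigquery.dataEditor is just one example role)
    gcloud projects add-iam-policy-binding MY_PROJECT \
        --member="serviceAccount:my-sa@MY_PROJECT.iam.gserviceaccount.com" \
        --role="roles/bigquery.dataEditor"

    # Run the Cloud Run job as that service account
    gcloud run jobs update my-bq-job \
        --service-account=my-sa@MY_PROJECT.iam.gserviceaccount.com \
        --region=us-central1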
For offline Docker, I'm still not sure.
From a recent project I made… (All commands have an equivalent in the GCP Console UI, if you are more familiar with that.)
Assumptions:
Create the service account (SA) and deploy the docker image as a Run service
-----------------------------------------------------------------------------
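Roughly like this (a sketch; my-run-sa, my-service and the image path are placeholders):

    # Create the service account
    gcloud iam service-accounts create my-run-sa \
        --display-name="Cloud Run BigQuery writer"

    # Deploy the image as a Cloud Run service running as that SA
    gcloud run deploy my-service \
        --image=gcr.io/MY_PROJECT/my-image \
        --service-account=my-run-sa@MY_PROJECT.iam.gserviceaccount.com \
        --region=us-central1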
The SA must have roles to access or edit the BQ dataset and table (ref.: https://cloud.google.com/bigquery/docs/control-access-to-resources-iam#grant_access_to_a_table_or_view)
You can download the BQ config to a JSON file on your dev machine…
For Dataset
-----------
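Dump the dataset metadata, including its access list, to a file (PROJECT_ID:DATASET and the file name are placeholders):

    bq show --format=prettyjson PROJECT_ID:DATASET > dataset_permissions.json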
Then update the file to include the BQ roles (replace <SERVICE_ACCOUNT_EMAIL> with your SA email, and <BQ_ROLE_1> and <BQ_ROLE_2> with your roles):
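Keep the existing entries in the "access" array of the file and append entries for the SA, roughly like:

    {
      "role": "<BQ_ROLE_1>",
      "userByEmail": "<SERVICE_ACCOUNT_EMAIL>"
    },
    {
      "role": "<BQ_ROLE_2>",
      "userByEmail": "<SERVICE_ACCOUNT_EMAIL>"
    }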
Finally update the policy:
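Assuming the file name from the step above:

    bq update --source dataset_permissions.json PROJECT_ID:DATASET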
For Table
---------
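Tables use an IAM policy instead of the dataset-style access list, so download it with something like:

    bq get-iam-policy --format=prettyjson PROJECT_ID:DATASET.TABLE > table_policy.json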
Then update the file to include the BQ roles, e.g.:
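Add the SA to the "bindings" section of the file; the role below is only an example:

    "bindings": [
      {
        "members": [
          "serviceAccount:<SERVICE_ACCOUNT_EMAIL>"
        ],
        "role": "roles/bigquery.dataEditor"
      }
    ]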
Finally update the policy:
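Again assuming the file name from above:

    bq set-iam-policy PROJECT_ID:DATASET.TABLE table_policy.json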
In your service code (example made with NodeJS):
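A minimal sketch with the @google-cloud/bigquery client (dataset and table IDs are placeholders); on Cloud Run the client picks up the attached service account automatically through Application Default Credentials, so no key file is needed:

    // Append rows to a BigQuery table using the official Node.js client.
    const { BigQuery } = require('@google-cloud/bigquery');

    const bigquery = new BigQuery();

    async function appendRows() {
      const rows = [{ name: 'example', created_at: new Date().toISOString() }];
      // Streaming insert into <DATASET_ID>.<TABLE_ID> (placeholders)
      await bigquery.dataset('<DATASET_ID>').table('<TABLE_ID>').insert(rows);
      console.log(`Inserted ${rows.length} row(s)`);
    }

    appendRows().catch((err) => {
      console.error('Insert failed:', err);
      process.exit(1);
    });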