I have an app using:
- SAM
- AWS S3
- AWS Lambda based on Docker
- AWS SAM pipeline
- GitHub Actions
In the Dockerfile I have:
RUN aws s3 cp s3://mylambda/distilBERT distilBERT.tar.gz
Resulting in the error message:
Step 6/8 : RUN aws s3 cp s3://mylambda/distilBERT distilBERT.tar.gz
---> Running in 786873b916db
fatal error: Unable to locate credentials
Error: InferenceFunction failed to build: The command '/bin/sh -c aws s3 cp s3://mylambda/distilBERT distilBERT.tar.gz' returned a non-zero code: 1
I need to find a way to store the credentials in a secure manner. Is it possible with GitHub Secrets or something similar?
Thanks
2 Answers
Docker by default does not have access to the .aws folder on the host machine. You could either pass the AWS credentials as environment variables to the Docker image, or use one of the approaches below. Keep in mind that hardcoding AWS credentials in a Dockerfile is bad practice. To avoid this, you can pass the environment variables at runtime using docker run -e MYVAR1 or docker run --env MYVAR2=foo arguments, as in the sketch below.
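A minimal sketch of both variants; the image name my-image is a placeholder, and since the aws s3 cp in the question runs during the build (not in the running container), the build-time variant is the one that addresses the error:

# Run time: forward the credentials from the host environment into the container.
docker run \
  -e AWS_ACCESS_KEY_ID="$AWS_ACCESS_KEY_ID" \
  -e AWS_SECRET_ACCESS_KEY="$AWS_SECRET_ACCESS_KEY" \
  my-image

# Build time: use build arguments instead, with matching
# ARG AWS_ACCESS_KEY_ID / ARG AWS_SECRET_ACCESS_KEY lines in the Dockerfile
# before the RUN aws s3 cp step.
docker build \
  --build-arg AWS_ACCESS_KEY_ID="$AWS_ACCESS_KEY_ID" \
  --build-arg AWS_SECRET_ACCESS_KEY="$AWS_SECRET_ACCESS_KEY" \
  -t my-image .

Be aware that build arguments are recorded in the image metadata (visible via docker history), so they are still not a great place for long-lived secrets.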
Another solution would be to use an .env file for the environment variables, and a more involved solution would be to map a volume for the ~/.aws folder from the host machine into the Docker image; both are sketched below.
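A sketch of both options, again with placeholder values and image name:

# .env (keep this file out of version control)
AWS_ACCESS_KEY_ID=AKIA...
AWS_SECRET_ACCESS_KEY=...

# Load the whole file at run time:
docker run --env-file .env my-image

# Or map the host's ~/.aws folder into the container read-only:
docker run -v ~/.aws:/root/.aws:ro my-image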
My solution may be a bit longer, but I feel it solves your problem.
Steps:
You can add the credentials in GitHub Actions (since you already mentioned GitHub Actions) as secrets.
In your GitHub CI/CD flow, when you build the Dockerfile, you can create an AWS credentials file from those secrets, as sketched below.
Changing your credentials then only requires changing your GitHub Actions secrets.
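A minimal sketch of that flow, assuming you have added repository secrets named AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY (the workflow and image names are placeholders):

name: build
on: push
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Create an AWS credentials file for the Docker build
        run: |
          mkdir -p aws
          cat > aws/credentials <<EOF
          [default]
          aws_access_key_id = ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws_secret_access_key = ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          EOF
      - name: Build the image
        run: docker build -t inference-function .

In the Dockerfile you would then copy the generated file into place before the S3 download:

COPY aws/credentials /root/.aws/credentials
RUN aws s3 cp s3://mylambda/distilBERT distilBERT.tar.gz

If you go this route, prefer a multi-stage build so the credentials file only exists in the builder stage and never ends up in a layer of the final Lambda image. Rotating the credentials then only means updating the two repository secrets.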