I’m looking for a way to create/generate an env file so that Docker Cloud Build or AWS Elastic Beanstalk (via CodePipeline) can pull it directly from the repository after a GitHub Action generates it.
I know that most, if not all, hosting services have their own way of managing environment variables, but I’m trying to figure out whether the file can be generated by a GitHub Action before being pushed into AWS CodePipeline or Docker Cloud Build, so that there’s only one file to touch when changing configs.
I’ve found that it’s generally preferred to set env variables up manually on the servers for security purposes, but my target here is mostly test servers and self-hosted servers (Docker) for personal use, so security is not the top priority. At the same time, I’d rather have the file generated by a GitHub Action (or something else, if there’s a different option that supports both Docker and AWS) instead of pushing the .env file directly to the repository.
I have a GitHub Action under .github/workflows/generate-env.yml:
name: generate-env

on:
  push:
    branches:
      - "test"

jobs:
  generate_env:
    runs-on: ubuntu-22.04
    steps:
      - name: Generate env file
        run: |
          touch .env
          echo "${{ vars.APP_ENV }}" >> .env
          cat .env
It generates and shows the .env contents correctly, but Docker and CodePipeline are not able to pull it.
Looking forward to suggestions.
2 Answers
Yes, it’s possible.
The workflow below generates an env file with GitHub Actions and passes it directly to the server. Once the file is on the server, it can be used by Docker or some script as environment variables.
The step "creates a .env file" assigns the value of the secret ENV_FILE to the environment variable ENV_FILE and then echoes that value into the .env file. The next step uses scp to send the .env file to the remote server. The variables used have self-explanatory names; they must be added beforehand in your GitHub repository settings.
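A minimal sketch of such a workflow, assuming the secret ENV_FILE holds the entire contents of the env file, and that SSH_HOST, SSH_USER, SSH_PRIVATE_KEY, and TARGET_PATH are the connection details stored as repository secrets (these names are illustrative, not fixed):

name: deploy-env

on:
  push:
    branches:
      - "test"

jobs:
  deploy_env:
    runs-on: ubuntu-22.04
    steps:
      - name: creates a .env file
        env:
          ENV_FILE: ${{ secrets.ENV_FILE }}
        run: |
          # write the whole secret into .env on the runner
          echo "$ENV_FILE" > .env

      - name: copy .env to the server
        run: |
          # write the SSH key from a secret, then scp the file across
          echo "${{ secrets.SSH_PRIVATE_KEY }}" > key.pem
          chmod 600 key.pem
          scp -i key.pem -o StrictHostKeyChecking=no .env \
            "${{ secrets.SSH_USER }}@${{ secrets.SSH_HOST }}:${{ secrets.TARGET_PATH }}/.env"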
This is the way to do it. You’re already storing your config settings in vars.APP_ENV. Just store them somewhere else, somewhere that easily integrates with your hosting.
You can generate the file in a GitHub Action, but merely generating it in the action’s temporary storage won’t automatically make it available to other build jobs that pull the same commit source from GitHub. It also means you have to rerun the build jobs to change environment variables.
That’s why, instead of storing the config in vars.APP_ENV, you should store it in cloud storage somewhere convenient for your program to access. Then you can just update your deployments or restart your servers. There can always be one "thing" to touch when changing configs; it doesn’t have to be vars.APP_ENV.
My suggestion would be SSM Parameter Store from AWS, since you’re already in AWS. https://repost.aws/knowledge-center/elastic-beanstalk-use-env-variables describes one way to import the values at initialization time, though that doesn’t seem like a particularly compelling approach to me. Other hosting options on AWS make this easier; Fargate might also be cheaper, and easier in other ways. I consider Elastic Beanstalk to be previous-generation technology.
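For example, each setting can be pushed into Parameter Store once with the AWS CLI (the parameter path /myapp/test/ here is only an illustration):

# store one config value as an encrypted parameter; repeat per setting
aws ssm put-parameter \
  --name "/myapp/test/APP_ENV" \
  --type SecureString \
  --value "some-value" \
  --overwrite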
Another option would be to essentially move the GitHub Action to an initialization step in your Dockerfile’s entrypoint, which would subsequently invoke your program. You can grant Parameter Store access to the application’s role, then make a call to Parameter Store and redirect the output to your .env. This is simple and straightforward.
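A rough sketch of such an entrypoint, assuming the AWS CLI is available in the image, the application’s role has ssm:GetParametersByPath, and the parameters live under an illustrative path /myapp/test/:

#!/bin/sh
# entrypoint.sh -- pull all parameters under the path, write KEY=value lines
# into .env, then hand off to the application command
set -e

aws ssm get-parameters-by-path \
  --path "/myapp/test/" \
  --with-decryption \
  --query "Parameters[*].[Name,Value]" \
  --output text \
| while read -r name value; do
    # strip the path prefix so /myapp/test/APP_ENV becomes APP_ENV
    echo "${name##*/}=$value" >> .env
  done

exec "$@"

The Dockerfile would then set ENTRYPOINT to this script and keep the application start command in CMD, so exec "$@" launches the program once the .env file has been written.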