
I’m implementing a workflow I’m fairly familiar with: deploying a web distribution to AWS using GitHub Actions. The only new part is that the project is built with the Godot engine, so maybe that step is interfering somehow. By default, the AWS CLI is installed on ubuntu-latest.

Here’s my GitHub workflow:

jobs:
  export-web:
    name: Web Export
    runs-on: ubuntu-latest
    container:
      image: barichello/godot-ci:4.3
    steps:

      - name: Checkout
        uses: actions/checkout@v4
        with:
          lfs: true

      - name: Setup
        run: |
          mkdir -v -p ~/.local/share/godot/export_templates/
          mv /root/.local/share/godot/export_templates/${GODOT_VERSION}.stable ~/.local/share/godot/export_templates/${GODOT_VERSION}.stable

      - name: Web Build
        run: |
          mkdir -v -p build/web
          EXPORT_DIR="$(readlink -f build)"
          cd $PROJECT_PATH
          godot --headless --verbose --export-release "Web" "$EXPORT_DIR/web/index.html"

      - name: Upload Artifact
        uses: actions/upload-artifact@v4
        with:
          name: web
          path: build/web

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      - name: Publish to AWS S3
        run: aws s3 cp --recursive ./build/web/ s3://${{ secrets.AWS_DISTRIBUTION_S3_BUCKET }}/

      - name: Invalidate AWS CloudFront distribution
        run: aws cloudfront create-invalidation --distribution-id ${{ secrets.AWS_CLOUDFRONT_DISTRIBUTION_ID }} --paths "/*"

And here’s the log of the failed step, "Publish to AWS S3":

Run aws s3 cp --recursive ./build/web/ s3://***/
  aws s3 cp --recursive ./build/web/ s3://***/
  shell: sh -e {0}
  env:
    GODOT_VERSION: 4.3
    EXPORT_NAME: bubble-up
    PROJECT_PATH: project
    AWS_DEFAULT_REGION: us-east-1
    AWS_REGION: us-east-1
    AWS_ACCESS_KEY_ID: ***
    AWS_SECRET_ACCESS_KEY: ***
/__w/_temp/0957ef75-e32d-4d86-a85d-b14ede405e83.sh: 1: aws: not found
Error: Process completed with exit code 127.

2 Answers

  1. Per the documentation, runs-on: ubuntu-latest only specifies the virtual machine that hosts the container; every step runs inside barichello/godot-ci:4.3, and that image itself doesn’t have the AWS CLI installed. You could try installing it manually by adding the following step after the Setup step:

          - name: Install AWS CLI
            run: |
              apt-get -y update
              apt-get -y install python3-pip
              pip3 install awscli
    
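    If pip isn’t available or is locked down in the image, another option is the official AWS CLI v2 bundled installer. A minimal sketch, assuming curl and unzip can be pulled in with apt-get inside the container (which runs as root, so no sudo):

          - name: Install AWS CLI v2
            run: |
              apt-get -y update
              apt-get -y install curl unzip
              # download and run the official AWS CLI v2 installer
              curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o awscliv2.zip
              unzip -q awscliv2.zip
              ./aws/install
              # confirm the CLI is on PATH before the publish step runs
              aws --version
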
  2. The error aws: not found indicates that the AWS CLI is not available inside the container. Even though the job runs on ubuntu-latest, every step executes in the barichello/godot-ci:4.3 image, which does not ship with the AWS CLI by default.

    To fix this, add a step that installs the AWS CLI before the "Publish to AWS S3" step (steps run as root inside the container, so sudo isn’t needed):

          - name: Install AWS CLI
            run: |
              apt-get update
              apt-get install -y awscli
    

    Place this step right before the "Publish to AWS S3" step in your workflow. This ensures that the AWS CLI is available when running the aws s3 cp command.
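
    Alternatively, since the web build is already uploaded as an artifact, the publish and invalidation steps could move into a separate job that runs directly on the ubuntu-latest runner, where the AWS CLI is preinstalled, instead of inside the Godot container. A rough sketch, reusing the artifact name and secrets from the question and a current major version of the credentials action:

      deploy:
        name: Deploy to S3
        needs: export-web
        runs-on: ubuntu-latest
        steps:
          # pull the web export produced by the export-web job
          - name: Download Artifact
            uses: actions/download-artifact@v4
            with:
              name: web
              path: build/web

          - name: Configure AWS Credentials
            uses: aws-actions/configure-aws-credentials@v4
            with:
              aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
              aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
              aws-region: us-east-1

          # these run on the runner itself, where aws is already on PATH
          - name: Publish to AWS S3
            run: aws s3 cp --recursive ./build/web/ s3://${{ secrets.AWS_DISTRIBUTION_S3_BUCKET }}/

          - name: Invalidate AWS CloudFront distribution
            run: aws cloudfront create-invalidation --distribution-id ${{ secrets.AWS_CLOUDFRONT_DISTRIBUTION_ID }} --paths "/*"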
