
I have written a workflow file that prepares the runner to connect to the desired server over SSH, so that I can run an Ansible playbook.

ssh -t -v theUser@theHost shows me that the SSH connection works.

The Ansible script, however, tells me that the sudo password is missing.

If I leave out the line ssh -t -v theUser@theHost, Ansible throws a connection timeout and can't connect to the server:

=> fatal: [***]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: ssh: connect to host *** port 22: Connection timed out

First, I don't understand why Ansible can connect to the server only if I execute the command ssh -t -v theUser@theHost first.

The next problem is that the user does not need any sudo password to have execution rights. The same Ansible playbook works fine from my local machine without the sudo password. I configured the server so that the user has sufficient rights in the desired folder, recursively.
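
For reference, passwordless sudo on a server is normally granted with a rule like the following (a sketch only; theUser is the placeholder name from above, and my actual configuration may be scoped more narrowly):

    # on the server: allow theUser to sudo without a password prompt
    echo 'theUser ALL=(ALL) NOPASSWD: ALL' | sudo tee /etc/sudoers.d/theUser
    sudo chmod 440 /etc/sudoers.d/theUser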

It simply doesn't work from my GitHub Action.
Can you please tell me what I am doing wrong?

My workflow file looks like this:

name: CI

# Controls when the workflow will run
on:
  # Triggers the workflow on push or pull request events but only for the "master" branch
  push:
    branches: [ "master" ]

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:
  
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  run-playbooks:
    runs-on: ubuntu-latest
    steps: 
      - uses: actions/checkout@v3
        with:
          submodules: true
          token: ${{secrets.REPO_TOKEN}}
      - name: Run Ansible Playbook
        run: |
         mkdir -p /home/runner/.ssh/
         touch /home/runner/.ssh/config
         touch /home/runner/.ssh/id_rsa
         echo -e "${{secrets.SSH_KEY}}" > /home/runner/.ssh/id_rsa
         echo -e "Host ${{secrets.SSH_HOST}}\nIdentityFile /home/runner/.ssh/id_rsa" >> /home/runner/.ssh/config
         ssh-keyscan -H ${{secrets.SSH_HOST}} > /home/runner/.ssh/known_hosts
         cd myproject-infrastructure/ansible
         eval `ssh-agent -s`
         chmod 700 /home/runner/.ssh/id_rsa
         ansible-playbook -u ${{secrets.ANSIBLE_DEPLOY_USER}} -i hosts.yml setup-prod.yml
      

2 Answers


  1. Chosen as BEST ANSWER

    Finally found it

    First, the basic setup of the action itself.

    
    name: CI
    
    # Controls when the workflow will run
    on:
      # Triggers the workflow on push or pull request events but only for the "master" branch
      push:
        branches: [ "master" ]
    
      # Allows you to run this workflow manually from the Actions tab
      workflow_dispatch:
    
    # A workflow run is made up of one or more jobs that can run sequentially or in parallel
    jobs:
    

    Next, add a job and check out the repository in its first step.

    jobs:
      run-playbooks:
        runs-on: ubuntu-latest
        steps: 
          - uses: actions/checkout@v3
            with:
              submodules: true
              token: ${{secrets.REPO_TOKEN}}
    
    

    Next, set up ssh correctly.

      - name: Setup ssh
        shell: bash
        run: |
          service ssh status
          eval `ssh-agent -s`
    
    

    First of all, you want to be sure that the ssh service is running. It was already running in my case; however, when I experimented with Docker, I had to start the service manually first, like service ssh start.
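
    A start-if-needed guard covers both situations (a sketch; the service name can differ between images):

             service ssh status || service ssh start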

    Next, be sure that the .ssh folder exists for your user and copy your private key to that folder. I added a GitHub secret to my repository where I saved my private key. In my case the user is the runner user:

             mkdir -p /home/runner/.ssh/
             touch /home/runner/.ssh/id_rsa
             echo -e "${{secrets.SSH_KEY}}" > /home/runner/.ssh/id_rsa
    
    

    Make sure that your private key is protected. If the file is readable by other users, ssh refuses to work with it (you get an "UNPROTECTED PRIVATE KEY FILE" warning). To do so:

             chmod 700 /home/runner/.ssh/id_rsa
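
    While not strictly required, tightening the .ssh directory itself is a common companion step (an optional extra, not part of my original setup):

             chmod 700 /home/runner/.ssh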
    

    Normally when you start an SSH connection, you are asked whether you want to save the host permanently as a known host. As we are running non-interactively, we can't type in yes, and if the prompt is never answered the process will fail.

    You have to prevent the process from being interrupted by that prompt. To do so, you add the host to the known_hosts file yourself, using ssh-keyscan. Unfortunately, ssh-keyscan can produce output for different key types, and simply calling ssh-keyscan was not enough in my case; I had to add explicit type options to the command. The generated output has to be written to the known_hosts file in the .ssh folder of your user, in my case /home/runner/.ssh/known_hosts.

    So the next command is:

             ssh-keyscan -t rsa,dsa,ecdsa,ed25519 ${{secrets.SSH_HOST}} >> /home/runner/.ssh/known_hosts
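
    An alternative that also avoids the prompt, which I did not use here, is OpenSSH's accept-new host-key policy (available since OpenSSH 7.6), configured per host:

             echo -e "Host ${{secrets.SSH_HOST}}\n  StrictHostKeyChecking accept-new" >> /home/runner/.ssh/config

    Pre-populating known_hosts with ssh-keyscan, as above, keeps the host key handling explicit, which is why I stayed with it.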
    

    Now you are almost there. Just call the ansible-playbook command to run the Ansible script. I created a new step in which I change to the folder of my repository where my Ansible files are saved. Note that the ssh-agent started in the earlier step does not survive into this step (every run: block is a separate shell), which is why the private key is passed explicitly with --private-key:

      - name: Run ansible script
        shell: bash
        run: |
          cd infrastructure/ansible
          ansible-playbook --private-key /home/runner/.ssh/id_rsa -u ${{secrets.ANSIBLE_DEPLOY_USER}} -i hosts.yml setup-prod.yml
    
    

    The complete file:

    name: CI
    
    # Controls when the workflow will run
    on:
      # Triggers the workflow on push or pull request events but only for the "master" branch
      push:
        branches: [ "master" ]
    
      # Allows you to run this workflow manually from the Actions tab
      workflow_dispatch:
      
    # A workflow run is made up of one or more jobs that can run sequentially or in parallel
    jobs:
      run-playbooks:
        runs-on: ubuntu-latest
        steps: 
          - uses: actions/checkout@v3
            with:
              submodules: true
              token: ${{secrets.REPO_TOKEN}}
          - name: Setup SSH 
            shell: bash
            run: |
             eval `ssh-agent -s`
             mkdir -p /home/runner/.ssh/
             touch /home/runner/.ssh/id_rsa
             echo -e "${{secrets.SSH_KEY}}" > /home/runner/.ssh/id_rsa
             chmod 700 /home/runner/.ssh/id_rsa
             ssh-keyscan -t rsa,dsa,ecdsa,ed25519 ${{secrets.SSH_HOST}} >> /home/runner/.ssh/known_hosts
          - name: Run ansible script
            shell: bash 
            run: |
              service ssh status
              cd infrastructure/ansible
              cat setup-prod.yml
              ansible-playbook -vvv --private-key /home/runner/.ssh/id_rsa -u ${{secrets.ANSIBLE_DEPLOY_USER}} -i hosts.yml setup-prod.yml
    
    

    Next enjoy...


  2. An alternative, without explaining why you have those errors, is to test and use the dawidd6/action-ansible-playbook action to run your playbook.

    That way, you can test whether the "sudo Password is missing" error still occurs in that configuration.

    - name: Run playbook
      uses: dawidd6/action-ansible-playbook@v2
      with:
        # Required, playbook filepath
        playbook: deploy.yml
        # Optional, directory where playbooks live
        directory: ./
        # Optional, SSH private key
        key: ${{secrets.SSH_PRIVATE_KEY}}
        # Optional, literal inventory file contents
        inventory: |
          [all]
          example.com
    
          [group1]
          example.com
        # Optional, SSH known hosts file content
        known_hosts: .known_hosts
        # Optional, encrypted vault password
        vault_password: ${{secrets.VAULT_PASSWORD}}
        # Optional, galaxy requirements filepath
        requirements: galaxy-requirements.yml
        # Optional, additional flags to pass to ansible-playbook
        options: |
          --inventory .hosts
          --limit group1
          --extra-vars hello=there
          --verbose
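
    If the missing sudo password turns out to be the real issue, the become password can also be fed in through the options input (a sketch; BECOME_PASS is a hypothetical secret you would have to create, and ansible_become_pass is Ansible's standard variable for it):

        options: |
          --extra-vars "ansible_become_pass=${{ secrets.BECOME_PASS }}"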
    