
In Azure DevOps, I have a main pipeline consisting of two stages. In the first stage, I generate a YAML file named ‘abc.yaml’ that isn’t stored in the repository. In the subsequent stage of the main pipeline, I aim to execute the generated ‘abc.yaml’. Does Azure DevOps support this scenario? Can I trigger a pipeline which is not in the repository?

2 Answers


  1. Prior to execution, Azure Pipelines expands the templates and evaluates expressions, which is often referred to as compile-time expansion:

    To turn a pipeline into a run, Azure Pipelines goes through several steps in this order:

    1. First, expand templates and evaluate template expressions.
    2. Next, evaluate dependencies at the stage level to pick the first stage(s) to run.
    3. For each stage selected to run, two things happen:
      • All resources used in all jobs are gathered up and validated for authorization to run.
      • Evaluate dependencies at the job level to pick the first job(s) to run.
    4. For each job selected to run, expand multi-configs (strategy: matrix or strategy: parallel in YAML) into multiple runtime jobs.
    5. For each runtime job, evaluate conditions to decide whether that job is eligible to run.
    6. Request an agent for each eligible runtime job.

    So, in other words, the YAML is fully known after the first step. You can't generate a YAML file mid-run and have the pipeline pick it up, because of the authorization and validation activities that happen in step 3. (For example, imagine someone approves a deployment and then the pipeline generates additional activities that would require further approvals.)
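
    To make the compile-time/runtime distinction concrete, here is a minimal illustration (not from the original answer; the parameter and variable names are made up): template expressions in ${{ }} are resolved in step 1, while macro syntax $( ) is only resolved once a job runs on an agent.

    # illustration only
    parameters:
    - name: environment   # hypothetical parameter
      type: string
      default: dev

    variables:
      buildLabel: 'label-$(Build.BuildId)'   # macro, resolved at runtime

    steps:
    - script: echo "Target is ${{ parameters.environment }}"   # value fixed at compile time
      displayName: Compile-time value
    - script: echo "Label is $(buildLabel)"                     # value resolved at runtime
      displayName: Runtime value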

    However, Azure Pipelines has a very rich templating system that lets you influence how the YAML is constructed in that first step.

    Consider structuring your pipelines as templates with conditional insertion.

    For example:

    # pipeline.yml
    
    parameters:
    # This parameter is shown to the user when they run your pipeline
    - name: runAnalysis
      type: boolean
      default: false
    
    stages:
    - template: stages.yml
      parameters:
        type: web-app
        runAnalysis: ${{ parameters.runAnalysis }}
        
    
    # stages.yml
    
    # these parameters are supplied by the calling pipeline/template
    parameters:
    - name: type
      type: string
      default: web-app
      values:
      - web-app
      - sql-server
    
    - name: runAnalysis
      type: boolean
      default: false
    
    stages:
    
    - ${{ if eq( parameters.type, 'web-app') }}:
      - stage: build
        jobs:
        - job: compile
          steps:
          - ...
        - ${{ if eq( parameters.runAnalysis, true ) }}:
          - job: sast_scan
            steps:
            - ...
    
      - stage: deploy
        ...
    
    - ${{ elseif eq( parameters.type, 'sql-server') }}:
      - ${{ if eq( parameters.runAnalysis, true ) }}:
        - stage: terraform_plan
          ...
      - stage: terraform_apply
        ...
    

    It is also possible to nest templates:

    # stages.yml
    
    parameters:
    
    stages:
    - ${{ if eq( parameters.type, 'web-app' ) }}:
      - template: web-app.yml
        parameters:
        ...
    
    - ${{ elseif eq( parameters.type, 'sql-server' ) }}:
      - template: sql-server.yml
        parameters:
        ...
    

    The above is a sample, but there are many different ways you could structure your templates.
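
    As an illustration of what such a nested template might contain, here is a sketch of a hypothetical web-app.yml referenced above (the stage, job, and script contents are placeholders, not from the answer):

    # web-app.yml (hypothetical)
    
    parameters:
    - name: runAnalysis
      type: boolean
      default: false
    
    stages:
    - stage: build
      jobs:
      - job: compile
        steps:
        - script: echo "build the web app"        # placeholder
      - ${{ if eq( parameters.runAnalysis, true ) }}:
        - job: sast_scan
          steps:
          - script: echo "run static analysis"    # placeholder
    
    - stage: deploy
      dependsOn: build
      jobs:
      - job: release
        steps:
        - script: echo "deploy the web app"       # placeholder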

  2. Just as @Jonathan and @bryanbcook mentioned, it is not possible to generate a YAML template and execute its stages/jobs/steps within the same pipeline run, because pipeline expansion is completed at compile time, before runtime; we cannot generate the YAML file at runtime and then go back to compile time to change the expansion.

    For this, would you consider separating the workflow into two pipelines as a workaround? The downstream PipelineB is created from abc.yml in AnotherRepo, and the upstream PipelineA modifies abc.yml and pushes it to AnotherRepo. Whenever new commits to abc.yml are pushed to AnotherRepo, the CI trigger of PipelineB fires and executes the steps in abc.yml.
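
    Note that abc.yml has to exist in AnotherRepo before PipelineB can be created from it, so commit an initial version first. A minimal placeholder (its contents are an assumption; PipelineA overwrites them anyway) could be:

    # abc.yml - initial placeholder committed to AnotherRepo
    trigger:
    - main
    
    stages:
    - stage: StageB
      jobs:
      - job: JobB
        steps:
        - script: echo "placeholder - will be replaced by PipelineA"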

    Here are the brief steps for your reference.

    1. Create PipelineB referencing abc.yml in AnotherRepo;
    2. Create PipelineA to modify abc.yml and push the update to AnotherRepo;
      variables:
        TheProjectName: TheProjectName
        TheRepoName: AnotherRepo
        branchName: 'main'
        commitMessage: "Updating abc.yml in PipelineA - $(Build.BuildId)"
      
      stages:
      - stage: StageA
        jobs:
        - job: JobA
          displayName: Generate abc.yml and push to AnotherRepo
          steps:
            - checkout: git://$(TheProjectName)/$(TheRepoName)@refs/heads/$(branchName)
              persistCredentials: true
              displayName: Checkout resource from AnotherRepo
            - bash: |
                # Write the new abc.yml with a quoted heredoc so the nested quotes
                # around $(commitMessage) survive; the $(commitMessage) macro is
                # still expanded by Azure Pipelines before the script runs.
                cat > abc.yml << 'EOF'
                #abc.yml
                trigger:
                - main #Make sure the trigger is enabled so that the CI trigger of PipelineB will fire
                stages:
                - stage: StageB
                  jobs:
                  - job: JobB
                    steps:
                    - script: echo "$(commitMessage)"
                      displayName: Run a one-line script
                EOF
              displayName: Update abc.yml
            - bash: |
                echo "1. git checkout to main branch in AnotherRepo"
                git checkout -b $(branchName)
                git config --global user.email "[email protected]"
                git config --global user.name "PipelineA"
                echo "2. git add"
                git add abc.yml
                echo "3. git commit"
                git commit -m "$(commitMessage)"
                echo "4. git push"
                git push origin $(branchName)
              displayName: Push abc.yml to AnotherRepo
      
      
    3. PipelineB will then be triggered by the commit pushed by PipelineA and will execute the step from JobB under StageB in abc.yml;
    4. Since PipelineA uses the System.AccessToken of the pipeline's build service account (persisted by persistCredentials: true) to authenticate the push to AnotherRepo, confirm that the corresponding build service account has Contribute permission on AnotherRepo, in line with the job authorization scope; an alternative that passes the token explicitly is sketched after this list.
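
    If you prefer not to rely on persistCredentials, an alternative (a sketch, assuming the build service account still has Contribute permission) is to pass the job access token explicitly on the push, using the documented bearer-token header pattern for Azure Repos:

    - bash: |
        # Push using the job access token directly instead of persisted credentials
        git -c http.extraHeader="AUTHORIZATION: bearer $SYSTEM_ACCESSTOKEN" push origin $(branchName)
      displayName: Push abc.yml with an explicit access token
      env:
        SYSTEM_ACCESSTOKEN: $(System.AccessToken)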

    Hope the workaround can meet your expectations.
