
I am looking for an architectural solution for dynamically running an Azure pipeline. Let's assume I have a database containing information about various entities, where each entity holds the essential information required to execute a specific piece of software. The number of entities will grow over time, so I would like to create a pipeline that reads the data from the database (the easy step) and then generates an appropriate number of jobs based on that information, because each entity needs to run on its own VM. All of the processing is repeatable.

My initial idea was to create a new YAML template file and then override it in the first pipeline to trigger the appropriate actions. However, this approach seems somewhat inelegant, and I have concerns about its effectiveness.

Has anyone encountered a similar situation? Unfortunately, I need a solution like this because the number of pipelines that can be executed simultaneously is limited.

2 Answers


  1. Maybe matrix jobs are the correct solution.
    You just need to build a simple JSON object in your data collection task:

    {
      "entity1": {
        "Key1": "Value1", 
        "Key2": "Value2"
      },
      "entity2": {
        "Key1": "Value3", 
        "Key2": "Value4"
      }
    }
    

    Output it as a pipeline output variable:
    https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#understand-variable-syntax
    Then make the matrix job depend on the collection job and use the keys from the JSON as additional parameters (a sketch of the generator step follows the YAML below).

    - job: collectionJob
      steps:
        - task: <ProperTask>
          name: generator
    - job: doThings
      dependsOn: collectionJob
      condition: <check that the json output has content>
      strategy:
        maxParallel: <define how many>
        matrix: $[ dependencies.collectionJob.outputs['generator.json'] ]
      steps:
        # the keys of each matrix entry are available as pipeline variables
        - script: <your command> -Key1 $(Key1) -Key2 $(Key2)
    
        
    

    More info: https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/jobs-job-strategy?view=azure-pipelines
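
    For completeness, here is a minimal sketch of what the collection job might look like, assuming a PowerShell script step with the JSON hard-coded in place of the real database query; the step name generator and the variable name json are the ones the matrix expression above refers to:

    - job: collectionJob
      steps:
        # Hypothetical script step: the JSON would normally be built from the
        # database query; it is hard-coded here purely for illustration.
        - pwsh: |
            $matrixJson = '{ "entity1": { "Key1": "Value1", "Key2": "Value2" }, "entity2": { "Key1": "Value3", "Key2": "Value4" } }'
            Write-Host "##vso[task.setvariable variable=json;isOutput=true]$matrixJson"
          name: generator

    The condition placeholder in the doThings job would then typically check that this output variable is not empty.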

  2. I am afraid your requirement cannot be achieved with a single pipeline.

    Obtaining the number of instances and setting the number of jobs cannot happen in the same pipeline run: the job list is expanded when the pipeline is compiled, before any step has run and queried the database.

    Based on your requirement, I suggest using two pipelines. The first pipeline gets the instances and passes their names to the second pipeline via the REST API (Runs - Run Pipeline); the second pipeline uses an object-type parameter to loop over all the instances and create the jobs.

    Refer to the following steps:

    In the pipeline that gets the instances, collect all the instance names and pass them to the other pipeline:

    For example:

    $pat = "PAT"   # personal access token authorized to queue the second pipeline
    
    $url = "https://dev.azure.com/{orgname}/{projectname}/_apis/pipelines/{PipelineID}/runs?api-version=5.1-preview"
    
    # Build the Basic authentication header from the PAT
    $token = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($pat)"))
    
    $JSON = @'
    {
      "resources": {
        "repositories": {
          "self": {
            "refName": "refs/heads/master"
          }
        }
      },
      "templateParameters": {
        "instances": "[instance1,instance2,instance3]"
      }
    }
    '@
    
    $response = Invoke-RestMethod -Uri $url -Headers @{Authorization = "Basic $token"} -Method Post -Body $JSON -ContentType application/json
    

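    In practice, the "instances" value would be assembled dynamically from the names returned by the database query rather than hard-coded. A rough sketch of such a step in the first pipeline (the variable names here are assumptions for illustration):

    steps:
      - pwsh: |
          # Hypothetical result of the database query
          $names = @("instance1", "instance2", "instance3")
          # Build the value expected by the "instances" template parameter
          $instancesValue = "[" + ($names -join ",") + "]"
          Write-Host "Passing instances: $instancesValue"
          # ...insert $instancesValue into the templateParameters section of $JSON
          # and call Invoke-RestMethod as shown above.
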
    In the pipeline that sets the jobs, define an object parameter and loop over all of its values:

    For example:

    parameters:
      - name: instances
        type: object
        default: []
    
    pool:
      vmImage: ubuntu-20.04
    
    jobs:
    # one job is generated for each element of the instances parameter
    - ${{ each instance in parameters.instances }}:
      - job:
        steps:
          - script: echo ${{ instance }}
    

    Result:

    When we pass 3 instances to the pipeline, it will create 3 jobs.
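
    Conceptually, the ${{ each }} loop is expanded at compile time, so passing those three instances produces YAML roughly equivalent to:

    jobs:
    - job:
      steps:
        - script: echo instance1
    - job:
      steps:
        - script: echo instance2
    - job:
      steps:
        - script: echo instance3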


    You can also refer to my answer on another, similar ticket.
