
I am trying to pass an Azure Pipeline template parameter as a JSON string, but I can't figure out how to get it right so that I can call json.loads in the main pipeline's script to convert the string to JSON.

Template pipeline:

trigger: none

resources:
  repositories:
  - repository: P1-T_automation
    type: git
    name: One/P1-T_automation
    ref: afat
    
parameters:
- name: accountvars
  type: object
  default: |
    {
    "hostgroup1": {"account1": ["hostname1", "hostname2", "hostname3"]},
    "hostgroup2": {"account2": ["hostname4", "hostname5", "hostname6"]}
    }

extends:
  template: Pipelines/main.yaml@P1-T_automation
  parameters:
      accountvars: ${{ parameters.accountvars }}

The following task in the main pipeline, - bash: echo "${{ parameters.accountvars }}", gives me output without the double quotes (which JSON requires):

{
hostgroup1: {account1: [hostname1, hostname2, hostname3]},
hostgroup2: {account2: [hostname4, hostname5, hostname6]}
}

If I change the parameter type from object to string, I get the same output.

If I escape every double quote like this:

- name: accountvars
  type: string
  default: |
    {
    \"hostgroup1\": {\"account1\": [\"hostname1\", \"hostname2\", \"hostname3\"]},
    \"hostgroup2\": {\"account2\": [\"hostname4\", \"hostname5\", \"hostname6\"]}
    }

I get a good echo from the bash task:

{
"hostgroup1": {"account1": ["hostname1", "hostname2", "hostname3"]},
"hostgroup2": {"account2": ["hostname4", "hostname5", "hostname6"]}
}

but after passing it to the Python script as an argument:

- task: PythonScript@0
  displayName: Create fake data
  inputs:
    scriptSource: filePath
    scriptPath: $(Pipeline.Workspace)/self/Pipelines/scripts/datain.py
    arguments: ${{ variables.first_var }} $(topicName) ${{ parameters.accountvars }}
    workingDirectory: $(Pipeline.Workspace)
    pythonInterpreter: $(Pipeline.Workspace)/python_venv/bin/python

in the Python script, both print(sys.argv[3]) and print(str(sys.argv[3])) give me:

{
”hostgroup1": {"account1": ["hostname1", "hostname2", "hostname3"]},
"hostgroup2": {"account2": ["hostname4", "hostname5", "hostname6"]}
}
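
For what it's worth, the mangling is consistent with ordinary POSIX word splitting: the agent's shell consumes unquoted double quotes and splits on whitespace before the script ever sees argv. A quick way to reproduce this locally, since Python's shlex.split follows the same POSIX rules:

```python
import shlex

# shlex.split applies POSIX-style word splitting, the same rules a shell
# uses on the task's `arguments` string before handing argv to the script.
tokens = shlex.split('{"hostgroup1": {"account1": ["hostname1"]}}')
print(tokens)  # the double quotes are consumed and the JSON is split apart
```

The JSON comes out as several quote-stripped tokens, which is why sys.argv[3] only holds a fragment of the original string.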

Why is the first escaped double quote replaced with ”?

I need to make it a one-line string with double quotes so that it can be fed to json.loads in the Python script.

How can I achieve this?

2 Answers


  1. Chosen as BEST ANSWER

    This did the trick:

    - name: accountvars
      type: string
      default: >-
        "{
        'hostgroup1': {'account1': ['hostname1', 'hostname2', 'hostname3']},
        'hostgroup2': {'account2': ['hostname4', 'hostname5', 'hostname6']}
        }"
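
    A caveat on this answer: the single-quoted form above is not valid JSON, so json.loads will reject it. In the Python script you would likely parse it as a Python literal instead, e.g. with ast.literal_eval. A minimal sketch, assuming the script receives the braces-and-single-quotes string shown above:

    ```python
    import ast
    import json

    # Sample value in the shape this answer's parameter produces:
    # Python-literal syntax with single quotes, which json.loads rejects.
    raw = ("{'hostgroup1': {'account1': ['hostname1', 'hostname2', 'hostname3']}, "
           "'hostgroup2': {'account2': ['hostname4', 'hostname5', 'hostname6']}}")

    accountvars = ast.literal_eval(raw)  # safely evaluates the Python literal
    print(json.dumps(accountvars))       # re-serialize as proper JSON if needed
    ```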
    

  2. As a workaround, you can try using the expression convertToJson to convert the value of the object-type parameter into JSON content.

    See the example below as a reference:

    • azure-pipelines.yml
    parameters:
    - name: accountvars
      type: object
      default:
        hostgroup1:
          account1:
            - hostname1
            - hostname2
            - hostname3
        hostgroup2:
          account2:
            - hostname4
            - hostname5
            - hostname6
    
    jobs:
    - job: A
      displayName: 'Job A'
      steps:
      - task: Bash@3
        displayName: 'Print parameter'
        env:
          ACCOUNT_VARS: ${{ convertToJson(parameters.accountvars) }}
        inputs:
          targetType: inline
          script: echo "$ACCOUNT_VARS"
    
    • Result: the Bash task echoes the parameter as valid JSON (screenshot omitted).

    On the PythonScript@0 task, you can use the same method to map it as an env (environment variable).
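
    On the Python side, that env mapping pairs naturally with json.loads. A minimal sketch, assuming the task maps the parameter to ACCOUNT_VARS as above (the fallback sample is only there so the snippet runs outside the pipeline):

    ```python
    import json
    import os

    # Fallback sample so the sketch runs standalone; in the pipeline the
    # value comes from the env: mapping (ACCOUNT_VARS) on the task.
    sample = '{"hostgroup1": {"account1": ["hostname1", "hostname2", "hostname3"]}}'

    accountvars = json.loads(os.environ.get("ACCOUNT_VARS", sample))
    print(accountvars["hostgroup1"]["account1"])
    ```

    Reading the value from an environment variable avoids shell word splitting entirely, which is what mangled the quoted argument in the first place.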
