I am working on a pipeline in which one stage reads values from a key-value file and sets them as output variables using: echo "##vso[task.setvariable variable=$key;isOutput=true]$value"
In a later job I can read and print these values, but when I try to pass them to my template as parameters they do not come through, and during the template execution the console shows "command not found".
I have found many solutions that refer to passing parameters like ${{ parameters.varname }}, but I cannot find an example of passing dynamically created variables to a template.
My main pipeline code is below:
trigger:
  branches:
    exclude:
      - '*'
pr:
  branches:
    exclude:
      - '*'

parameters:
  - name: AWS_ACCESS_KEY_ID
    displayName: AWS_ACCESS_KEY_ID
    type: string
  - name: AWS_SECRET_ACCESS_KEY
    displayName: AWS_SECRET_ACCESS_KEY
    type: string
  - name: AWS_SESSION_TOKEN
    displayName: AWS_SESSION_TOKEN
    type: string

variables:
  branchName: "$(Build.SourceBranchName)"

stages:
  - stage: CheckBranch
    displayName: "Check Branch"
    jobs:
      - job: IdentifyBranch
        displayName: "Identify Branch Job"
        pool:
          vmImage: "ubuntu-latest"
        steps:
          - checkout: self
          - task: CmdLine@2
            name: VerifyBranchName
            displayName: "Identify branch if it is master or other feature/ release/ hotfix/*"
            inputs:
              script: |
                echo "Branch name is: $(Build.SourceBranchName)"
                if [ "$(Build.SourceBranchName)" = "master" ] || [[ "$(Build.SourceBranchName)" == hotfix/* ]]; then
                  echo "This is a production deployment (master branch)"
                  echo "##vso[task.setvariable variable=isProduction;isOutput=true]true"
                  echo "##vso[task.setvariable variable=isValidBranch;isOutput=true]true"
                elif [ "$(Build.SourceBranchName)" = "develop" ]; then
                  echo "This is a non-production deployment (develop branch)"
                  echo "##vso[task.setvariable variable=isProduction;isOutput=true]false"
                  echo "##vso[task.setvariable variable=isValidBranch;isOutput=true]true"
                else
                  echo "This branch is not valid for deployment, $(Build.SourceBranchName)"
                  exit 0
                fi

  - stage: LoadEnvironment
    displayName: "Load Environment Variables"
    dependsOn: CheckBranch
    jobs:
      - job: LoadEnvVariables
        displayName: "Load Environment Variables Job"
        pool:
          vmImage: "ubuntu-latest"
        variables:
          isProduction: $[ stageDependencies.CheckBranch.IdentifyBranch.outputs['VerifyBranchName.isProduction'] ]
          isValidBranch: $[ stageDependencies.CheckBranch.IdentifyBranch.outputs['VerifyBranchName.isValidBranch'] ]
        steps:
          - task: CmdLine@2
            name: LoadEnvironmentValues
            displayName: "Loading values based on branch type"
            inputs:
              script: |
                echo "Checking if it's a production branch..."
                if [ "$(isProduction)" = "true" ]; then
                  echo "Production branch detected"
                fi
                echo "Checking if the branch is valid..."
                if [ "$(isValidBranch)" = "true" ]; then
                  echo "Valid branch detected: $(isValidBranch)"
                else
                  echo "This is not a valid branch, terminating flow."
                  exit 1
                fi
                cd aws/greengrass/
                if [ "$(isProduction)" = "true" ]; then
                  echo "Loading production environment variables"
                  envFile="production.env"
                else
                  echo "Loading non-production environment variables"
                  envFile="non-production.env"
                fi
                # Load environment variables and dynamically create parameters
                while IFS= read -r line || [[ -n "$line" ]]; do
                  if [[ ! -z "$line" && "$line" != \#* ]]; then
                    key=$(echo "$line" | cut -d '=' -f 1)
                    value=$(echo "$line" | cut -d '=' -f 2-)
                    echo "Setting parameter $key=$value"
                    echo "##vso[task.setvariable variable=$key;isOutput=true]$value"
                  fi
                done < "$envFile"
          - script: |
              echo "stageDeps - $(stageDeps)"
              echo "COMPONENT_VERSION - $(LoadEnvironmentValues.COMPONENT_VERSION)"
              echo "GW_VERSION is $(LoadEnvironmentValues.GW_VERSION)"
              echo "SSO_ACCOUNT_ID - $(LoadEnvironmentValues.SSO_ACCOUNT_ID)"
              echo "AWS_REGION - $(LoadEnvironmentValues.AWS_REGION)"
              echo "GG_BUCKET_NAME - $(LoadEnvironmentValues.GG_BUCKET_NAME)"
              echo "KINESIS_STREAM - $(LoadEnvironmentValues.KINESIS_STREAM)"
              echo "CORE_DEVICE - $(LoadEnvironmentValues.CORE_DEVICE)"
            displayName: Check output variables
      - template: greengrass-component-deployment-templ.yml
        parameters:
          COMPONENT_VERSION: $[ dependencies.LoadEnvVariables.outputs['LoadEnvironmentValues.COMPONENT_VERSION'] ]
          GW_VERSION: $[ dependencies.LoadEnvVariables.outputs['LoadEnvironmentValues.GW_VERSION'] ]
          SSO_ACCOUNT_ID: $[ dependencies.LoadEnvVariables.outputs['LoadEnvironmentValues.SSO_ACCOUNT_ID'] ]
          AWS_REGION: $[ dependencies.LoadEnvVariables.outputs['LoadEnvironmentValues.AWS_REGION'] ]
          GG_BUCKET_NAME: $[ dependencies.LoadEnvVariables.outputs['LoadEnvironmentValues.GG_BUCKET_NAME'] ]
          KINESIS_STREAM: $[ dependencies.LoadEnvVariables.outputs['LoadEnvironmentValues.KINESIS_STREAM'] ]
          CORE_DEVICE: $[ dependencies.LoadEnvVariables.outputs['LoadEnvironmentValues.CORE_DEVICE'] ]
          AWS_ACCESS_KEY_ID: ${{ parameters.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ parameters.AWS_SECRET_ACCESS_KEY }}
          AWS_SESSION_TOKEN: ${{ parameters.AWS_SESSION_TOKEN }}
NOTE: Updated with the template code:
parameters:
  - name: COMPONENT_VERSION
    default: ''
  - name: GW_VERSION
    default: ''
  - name: SSO_ACCOUNT_ID
    default: ''
  - name: AWS_REGION
    default: ''
  - name: GG_BUCKET_NAME
    default: ''
  - name: KINESIS_STREAM
    default: ''
  - name: CORE_DEVICE
    default: ''
  - name: AWS_ACCESS_KEY_ID
    default: ''
  - name: AWS_SECRET_ACCESS_KEY
    default: ''
  - name: AWS_SESSION_TOKEN
    default: ''

stages:
  - stage: CreateOrUpdateComponent
    dependsOn: LoadEnvironment
    variables:
      AWS_ACCESS_KEY_ID: ${{parameters.AWS_ACCESS_KEY_ID}}
      AWS_SECRET_ACCESS_KEY: ${{parameters.AWS_SECRET_ACCESS_KEY}}
      AWS_SESSION_TOKEN: ${{parameters.AWS_SESSION_TOKEN}}
      AWS_DEFAULT_REGION: ${{parameters.AWS_REGION}}
    displayName: 'Flow for Greengrass Component Creation and Update'
    jobs:
      - job: greengrass_component_deployment
        dependsOn: LoadEnvVariables
        variables:
          jobDeps: $[convertToJson(dependencies)]
          COMPONENT_VERSION: ${{ parameters.COMPONENT_VERSION }}
          GW_VERSION: ${{ parameters.GW_VERSION }}
          SSO_ACCOUNT_ID: ${{ parameters.SSO_ACCOUNT_ID }}
          AWS_REGION: ${{ parameters.AWS_REGION }}
          GG_BUCKET_NAME: ${{ parameters.GG_BUCKET_NAME }}
          KINESIS_STREAM: ${{ parameters.KINESIS_STREAM }}
          CORE_DEVICE: ${{ parameters.CORE_DEVICE }}
        steps:
          - script: |
              echo "deps - $(jobDeps)"
              echo "COMPONENT_VERSION - $(COMPONENT_VERSION)"
              echo "GW_VERSION is $(GW_VERSION)"
              echo "SSO_ACCOUNT_ID - $(SSO_ACCOUNT_ID)"
              echo "AWS_REGION - $(AWS_REGION)"
              echo "GG_BUCKET_NAME - $(GG_BUCKET_NAME)"
              echo "KINESIS_STREAM - $(KINESIS_STREAM)"
              echo "CORE_DEVICE - $(CORE_DEVICE)"

      # Job 1: Verify if the folder for the greengrass component exists in the S3 bucket
      - job: VerifyS3FolderForGGComponent
        displayName: 'Verify Folder for Greengrass Component'
        pool:
          vmImage: 'ubuntu-latest'
        steps:
          - script: |
              echo "AWS_REGIN: ${{parameters.AWS_REGION}}"
              echo "DevIoTDevice1 ${{parameters.CORE_DEVICE}}"
              echo "sso account id : ${{parameters.SSO_ACCOUNT_ID}}"
          # Task 1: Verify if the folder for the greengrass component exists
          - task: CmdLine@2
            displayName: 'Verify if Greengrass Folder Exists'
            inputs:
              script: |
                echo "Verifying if component folder exists in s3 Bucket..."
                COMPONENT_FOLDER_EXISTS=$(aws s3api list-objects --bucket "${{ parameters.GG_BUCKET_NAME }}" --query 'Contents[].Key' --output text | grep -c "artifacts/com.test.dl.gs/${{ parameters.COMPONENT_VERSION }}") || echo "0"
                echo "Component Exists: $COMPONENT_FOLDER_EXISTS"
                echo "##vso[task.setvariable variable=COMPONENT_FOLDER_EXISTS]$COMPONENT_FOLDER_EXISTS"
                echo "Verify s3 bucket component folder response : $COMPONENT_FOLDER_EXISTS"
          - task: CmdLine@2
            displayName: "When the component folder does not exist in the s3 bucket"
            condition: eq(variables['COMPONENT_FOLDER_EXISTS'], '0')
            inputs:
              script: |
                pwd
                ls -lrtha
                cd ./aws/greengrass
                ls -ltrh ./artifacts/com.test.dl.gs/${{ parameters.COMPONENT_VERSION }}
                echo "Creating the Artifact dir for the component ${{ parameters.GG_BUCKET_NAME }} "
                # Pushing data to the s3 bucket
                aws s3 sync ./artifacts/com.test.dl.gs/${{ parameters.COMPONENT_VERSION }} s3://${{ parameters.GG_BUCKET_NAME }}/artifacts/com.test.dl.gs/${{ parameters.COMPONENT_VERSION }}/
          - task: CmdLine@2
            displayName: "If component directory already exists..."
            condition: ne(variables['COMPONENT_FOLDER_EXISTS'], '0')
            inputs:
              script: |
                echo "Folder "artifacts/com.test.dl.gs/${{ parameters.COMPONENT_VERSION }}" already exists in bucket ${{ parameters.GG_BUCKET_NAME }}"
- template: greengrass-component-deployment-templ.yml
  parameters:
    COMPONENT_VERSION: $[ stageDependencies.LoadEnvironment.LoadEnvVariables.outputs['LoadEnvironmentValues.COMPONENT_VERSION'] ]
    GW_VERSION: $[ stageDependencies.LoadEnvironment.LoadEnvVariables.outputs['LoadEnvironmentValues.GW_VERSION'] ]
    SSO_ACCOUNT_ID: $[ stageDependencies.LoadEnvironment.LoadEnvVariables.outputs['LoadEnvironmentValues.SSO_ACCOUNT_ID'] ]
    AWS_REGION: $[ stageDependencies.LoadEnvironment.LoadEnvVariables.outputs['LoadEnvironmentValues.AWS_REGION'] ]
    GG_BUCKET_NAME: $[ stageDependencies.LoadEnvironment.LoadEnvVariables.outputs['LoadEnvironmentValues.GG_BUCKET_NAME'] ]
    KINESIS_STREAM: $[ stageDependencies.LoadEnvironment.LoadEnvVariables.outputs['LoadEnvironmentValues.KINESIS_STREAM'] ]
    CORE_DEVICE: $[ stageDependencies.LoadEnvironment.LoadEnvVariables.outputs['LoadEnvironmentValues.CORE_DEVICE'] ]
    AWS_ACCESS_KEY_ID: ${{ parameters.AWS_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ parameters.AWS_SECRET_ACCESS_KEY }}
    AWS_SESSION_TOKEN: ${{ parameters.AWS_SESSION_TOKEN }}
2 Answers
After hunting this problem for hours and hours, I finally figured it out, thanks to the hints and direction provided by @Alvin. The only change in my code is that I moved the values set in the previous stage into stage-level variables; my template is called at stage level, as you can see in my question code.
Here is a top-level snippet of my template pipeline.
Then I updated my variable references from ${{ parameters.COMPONENT_VERSION }} to $(COMPONENT_VERSION).
Finally, I removed parameters such as COMPONENT_VERSION from the template call.
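In outline, the change looks something like this (only two of the variables are shown here, the AWS credential parameters are left out of the snippet, and the stage, job and step names are the same ones used in the question):

# greengrass-component-deployment-templ.yml (top-level outline)
stages:
  - stage: CreateOrUpdateComponent
    dependsOn: LoadEnvironment
    variables:
      # Stage-level variables pull the outputs from the previous stage, so the
      # COMPONENT_VERSION / AWS_REGION template parameters are no longer needed.
      # (If stage-level variables do not resolve these in your setup, the same
      # mapping can live under the job's variables instead.)
      COMPONENT_VERSION: $[ stageDependencies.LoadEnvironment.LoadEnvVariables.outputs['LoadEnvironmentValues.COMPONENT_VERSION'] ]
      AWS_REGION: $[ stageDependencies.LoadEnvironment.LoadEnvVariables.outputs['LoadEnvironmentValues.AWS_REGION'] ]
    jobs:
      - job: greengrass_component_deployment
        pool:
          vmImage: 'ubuntu-latest'
        steps:
          - script: |
              # Macro syntax instead of ${{ parameters.COMPONENT_VERSION }}
              echo "COMPONENT_VERSION - $(COMPONENT_VERSION)"
              echo "AWS_REGION - $(AWS_REGION)"
            displayName: Check values from the LoadEnvironment stage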
Update
Based on the further discussions, below are the sample main pipeline YAML definition and template for your reference. Please indent - template: greengrass-component-deployment-templ.yml in MainPipelineDefinition.yml the same as - stage: LoadEnvironment, to reflect that you would prefer to pass output variables across stages rather than across jobs.
Unlike the parameters defined in MainPipelineDefinition.yml (such as ${{ parameters.AWS_ACCESS_KEY_ID }}), which are processed at compile time before the pipeline runs, the template parameters like ${{ parameters.AWS_REGION }} defined in greengrass-component-deployment-templ.yml can only retrieve their values dynamically at pipeline runtime. Therefore a script that uses a compile-time expression, like echo "AWS_REGIN: ${{parameters.AWS_REGION}}", will not work as expected. Instead, we can declare new variables in the downstream stage, like AWS_REGION: ${{ parameters.AWS_REGION }}, and use the macro syntax, like echo "AWS_REGION - $(AWS_REGION)". For further reference, please go through the documentation to understand variable syntax.
MainPipelineDefinition.yml:
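A condensed sketch of the idea (only COMPONENT_VERSION and AWS_REGION are kept, and their values are hardcoded here instead of being read from the env files):

trigger: none

stages:
  - stage: LoadEnvironment
    displayName: "Load Environment Variables"
    jobs:
      - job: LoadEnvVariables
        pool:
          vmImage: "ubuntu-latest"
        steps:
          - task: CmdLine@2
            name: LoadEnvironmentValues
            inputs:
              script: |
                # Hardcoded for brevity; the real pipeline reads these values
                # from production.env / non-production.env
                echo "##vso[task.setvariable variable=COMPONENT_VERSION;isOutput=true]1.0.0"
                echo "##vso[task.setvariable variable=AWS_REGION;isOutput=true]eu-west-1"

  # Indented the same as "- stage: LoadEnvironment", so the template's stage
  # becomes the next stage of this pipeline
  - template: greengrass-component-deployment-templ.yml
    parameters:
      # These runtime expressions are inserted literally at compile time and
      # only evaluated once the template assigns them to variables
      COMPONENT_VERSION: $[ stageDependencies.LoadEnvironment.LoadEnvVariables.outputs['LoadEnvironmentValues.COMPONENT_VERSION'] ]
      AWS_REGION: $[ stageDependencies.LoadEnvironment.LoadEnvVariables.outputs['LoadEnvironmentValues.AWS_REGION'] ]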
greengrass-component-deployment-templ.yml:
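A matching condensed sketch of the template: the parameters (which carry the $[ stageDependencies... ] expressions) are assigned to job variables so they are resolved at runtime, and the script then reads them with macro syntax:

parameters:
  - name: COMPONENT_VERSION
    default: ''
  - name: AWS_REGION
    default: ''

stages:
  - stage: CreateOrUpdateComponent
    dependsOn: LoadEnvironment
    displayName: 'Flow for Greengrass Component Creation and Update'
    jobs:
      - job: greengrass_component_deployment
        pool:
          vmImage: 'ubuntu-latest'
        variables:
          # The runtime expressions passed in as parameters are evaluated here
          COMPONENT_VERSION: ${{ parameters.COMPONENT_VERSION }}
          AWS_REGION: ${{ parameters.AWS_REGION }}
        steps:
          - script: |
              # An inline ${{ parameters.AWS_REGION }} would expand to the
              # unevaluated expression text; the macro syntax works
              echo "COMPONENT_VERSION - $(COMPONENT_VERSION)"
              echo "AWS_REGION - $(AWS_REGION)"
            displayName: Check values received from the LoadEnvironment stage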
From the indentation of the current YAML definition, the greengrass-component-deployment-templ.yml template would add a separate job under the same stage LoadEnvironment, alongside the job LoadEnvVariables. In that case we should not use stageDependencies to pass output variables across jobs in the same stage; instead, we can use an expression like $[ dependencies.LoadEnvVariables.outputs['LoadEnvironmentValues.COMPONENT_VERSION'] ], according to the documentation on levels of output variables. Here is a sample YAML pipeline for your reference.
Sample greengrass-component-deployment-templ.yml: