I want to share files between jobs in an Azure pipeline. I know that one method of doing this is the PublishPipelineArtifact task. Is there any other method of doing the same?
I am using Azure DevOps Services with a Linux machine as a self-hosted agent.
Thanks in Advance!
2 Answers
An alternative is the PublishBuildArtifacts task.
See the reference link for the differences between PublishPipelineArtifact and PublishBuildArtifacts.
Using the PublishPipelineArtifact task (or the PublishBuildArtifacts task) to publish files as a named artifact is a good choice. You can then consume the files with the DownloadPipelineArtifact task (or the DownloadBuildArtifacts task) in other jobs or pipelines. Could you let us know why you want to use another method?
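As a rough sketch, a minimal two-job pipeline using pipeline artifacts could look like the following (the artifact name `drop` and the paths are only examples):

```yaml
jobs:
- job: Build
  steps:
  # Publish files from the staging directory as a named pipeline artifact
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: '$(Build.ArtifactStagingDirectory)'
      artifact: 'drop'   # example artifact name

- job: Consume
  dependsOn: Build
  steps:
  # Download the artifact published by the Build job
  - task: DownloadPipelineArtifact@2
    inputs:
      artifact: 'drop'
      path: '$(Pipeline.Workspace)/drop'
```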
If you want to store and share large files, you can use an Azure Storage account. Run az storage blob upload in an Azure CLI task to upload a file to a storage blob, and run az storage blob download to download files from the storage account.
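For illustration, an Azure CLI task for the upload side might be sketched like this. The service connection name `my-azure-connection`, the storage account `mystorage`, and the container `share` are all hypothetical and must match your own environment:

```yaml
- task: AzureCLI@2
  inputs:
    azureSubscription: 'my-azure-connection'   # hypothetical service connection
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # Upload a file to the storage container
      az storage blob upload \
        --account-name mystorage \
        --container-name share \
        --name file1.zip \
        --file '$(Build.ArtifactStagingDirectory)/file1.zip' \
        --auth-mode login
```

A later job would run `az storage blob download` with the same account and container names to retrieve the file.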
If you are running different jobs in the same pipeline on the same self-hosted agent, you can define a shared path variable in your pipeline and store the files there. This method has limitations and is not recommended; see more info in this ticket.
For example:

- In one job, copy your target files (say, file1.zip) into the shared folder by setting TargetFolder to $(sharepath).
- In another job, copy the files from the shared folder into the current working directory to consume them, by setting SourceFolder to $(sharepath).

You can also use a repo to store the files and use checkout to consume them.
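The shared-folder approach above can be sketched as follows, assuming a variable named sharepath that points to a directory on the self-hosted agent (the path and file name are illustrative, and both jobs must run on the same agent):

```yaml
variables:
  sharepath: '/home/agent/shared'   # example directory on the self-hosted agent

jobs:
- job: Produce
  steps:
  # Copy the target file into the shared folder
  - task: CopyFiles@2
    inputs:
      Contents: 'file1.zip'
      TargetFolder: '$(sharepath)'

- job: Consume
  dependsOn: Produce
  steps:
  # Copy the files from the shared folder into the working directory
  - task: CopyFiles@2
    inputs:
      SourceFolder: '$(sharepath)'
      TargetFolder: '$(System.DefaultWorkingDirectory)'
```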
There may be other methods to share files. You need to choose based on your actual needs.