
I created an Azure Automation account with a PowerShell runbook that gets the names and sizes of the files in certain storage containers.

I’m running that code from Azure Data Factory with a Webhook activity.

The code runs fine for small and medium-sized containers.

The problem is that when I run the code against a container that holds a large number of files, the job is attempted 3 times, gets suspended, and nothing happens. In the logs I found this message:

The runbook job failed due to sandbox running out of memory. Each Azure Automation sandbox is allocated 400 MB of memory. The job was attempted 3 times before it was suspended. See some common ways to resolve this at https://aka.ms/AAMemoryLimit

Does anyone know how to resolve this situation, or is it possible to increase the memory? Thanks!

PS code:

#define parameters
param (
    [Parameter (Mandatory = $false)]
    [object] $WebHookData,
    [string]$StorageAccountName,
    [string]$StorageAccountKey
)


$Parameters = (ConvertFrom-Json -InputObject $WebHookData.RequestBody)

<#If ($Parameters.callBackUri)
{
    $callBackUri = $Parameters.callBackUri
}#>


$containerName = $Parameters.containerName
"->"+$StorageAccountName
"->"+$StorageAccountKey


$connectionName = "AzureRunAsConnection"
try
{
    # Get the connection "AzureRunAsConnection "
    $servicePrincipalConnection=Get-AutomationConnection -Name $connectionName       

    "Logging in to Azure..."
    Connect-AzAccount `
        -ServicePrincipal `
        -TenantId $servicePrincipalConnection.TenantId `
        -ApplicationId $servicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint 
}
catch {
    if (!$servicePrincipalConnection)
    {
        $ErrorMessage = "Connection $connectionName not found." 
        throw $ErrorMessage 
    } else{
        Write-Error -Message $_.Exception 
        throw $_.Exception 
    }
}


# Storage account name, key and container name come in through the runbook/webhook parameters above

#get blob context
$Ctx = New-AzStorageContext $StorageAccountName -StorageAccountKey $StorageAccountKey 

# get a list of all of the blobs in the container 
$listOfBlobs = Get-AzStorageBlob -Container $containerName -Context $Ctx 

# zero out our total
$length = 0 

# this loops through the list of blobs and retrieves the length for each blob
#   and adds it to the total
$listOfBlobs | ForEach-Object {$length = $length + $_.Length} 

# output the blobs and their sizes and the total 
Write-Host "List of Blobs and their size (length)"
Write-Host " " 
$select = $listOfBlobs | Select-Object -Property @{Name='ContainerName';Expression={$containerName}}, Name, @{name="Size";expression={$($_.Length)}}, LastModified 
#$listOfBlobs | select Name, Length, @{Name='ContainerName';Expression={$containerName}}
Write-Host " "
Write-Host "Total Length = " $length

#Define location and Export to CSV file
$SourceLocation = Get-Location 

$select | Export-Csv $SourceLocation'File-size/File-size-'$containerName'.csv' -NoTypeInformation -Force -Encoding UTF8 

$Context = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

Set-AzStorageBlobContent -Context $Context -Container "Name" -File $SourceLocation"File-size/File-size-$containerName.csv" -Blob "File-Size/File-size-$containerName.csv" -Force

2 Answers


  1. I had the same situation once too…

    You can’t increase the memory of the sandbox runtime. You could install the Hybrid Runbook Worker agent (or whatever it is called now) so the Azure Automation account runs your script on your local machine or on a server you set up.

    Your memory problem is caused by $listOfBlobs = Get-AzStorageBlob ...

    You cannot pull them all at once; you have to use paging (in other words, list the blobs in batches):

    $MaxReturn = 10000
    $ContainerName = "abc"
    $Total = 0
    $Token = $Null
    do
    {
        $Blobs = Get-AzStorageBlob -Container $ContainerName -MaxCount $MaxReturn -ContinuationToken $Token
        $Total += $Blobs.Count
        if ($Blobs.Length -le 0) { Break }
        $Token = $Blobs[$Blobs.Count - 1].ContinuationToken
    }
    While ($null -ne $Token)

    Echo "Total $Total blobs in container $ContainerName"
    

    Source -> https://learn.microsoft.com/en-us/powershell/module/az.storage/get-AzStorageblob?view=azps-8.2.0#example-4-list-blobs-in-multiple-batches
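
    Adapted to the question's script, here is a minimal sketch of the same batching combined with the size totalling and CSV export, so that only one batch of blob objects is held in memory at a time. It reuses $containerName and $Ctx from the question; the batch size and output path are assumptions:

    $MaxReturn = 10000                                   # assumed batch size
    $Token = $null
    $Total = 0
    $CsvPath = "File-size-$containerName.csv"            # assumed output path
    Remove-Item $CsvPath -ErrorAction SilentlyContinue   # start from a clean file

    do {
        # Fetch one page of blobs instead of loading the whole container into memory
        $Blobs = Get-AzStorageBlob -Container $containerName -Context $Ctx -MaxCount $MaxReturn -ContinuationToken $Token
        if ($Blobs.Count -le 0) { break }

        # Project each batch down to the few columns needed and append it to the CSV,
        # so the full blob objects can be discarded between batches
        $Blobs |
            Select-Object @{Name='ContainerName';Expression={$containerName}}, Name, @{Name='Size';Expression={$_.Length}}, LastModified |
            Export-Csv $CsvPath -NoTypeInformation -Append -Encoding UTF8

        # Add this batch to the running total
        $Blobs | ForEach-Object { $Total += $_.Length }

        $Token = $Blobs[$Blobs.Count - 1].ContinuationToken
    } while ($null -ne $Token)

    "Total Length = $Total"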

  2. By default, runbooks execute in Azure sandboxes, which are Microsoft-managed machines with a resource quota enforced per account/execution as a fair-use policy.

    For resource-hungry and long-running jobs, it’s always recommended to use the Hybrid Worker mode of execution.

    In this mode, you add machines as Hybrid Worker nodes (on-premises machines or cloud VMs) that have sufficient resources to execute your workload.

    You can refer to my blog here to learn more about how to configure a Hybrid Worker node and local/cloud machines for workload execution.
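
    Once a Hybrid Worker group is registered, the runbook can be started on it instead of on an Azure sandbox. A minimal sketch, with placeholder resource group, Automation account, runbook and worker group names:

    # Run the runbook on a Hybrid Runbook Worker group instead of an Azure sandbox
    Start-AzAutomationRunbook `
        -ResourceGroupName "my-rg" `
        -AutomationAccountName "my-automation-account" `
        -Name "Get-ContainerBlobSizes" `
        -RunOn "MyHybridWorkerGroup" `
        -Parameters @{ StorageAccountName = "mystorageaccount"; StorageAccountKey = "<key>" }

    When the runbook is triggered through a webhook (as in the question's Data Factory setup), the worker group is instead chosen when the webhook is created, for example with the -RunOn parameter of New-AzAutomationWebhook.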
