
Hi, I've come here looking for some help.

I have a Resource Graph Explorer query that normally generates a 900 KB .csv file (about 5k lines of data).

I have embedded the query inside a PowerShell runbook, to be executed in an Automation account and then sent as a .csv attachment in an email. The thing is, I'm trying to export the file this way:

    # Run the Resource Graph query
    $result = Search-AzGraph -Query $query

    if ($null -eq $result -or $null -eq $result.Data) {
        throw "No data returned from the query. Please check the query and try again."
    }

    $csvFilePath = "$env:TEMP\ResourcesInventory.csv"

    try {
        $result.Data | Export-Csv -Path $csvFilePath -NoTypeInformation
        Start-Sleep -Seconds 30

        if (-not (Test-Path $csvFilePath)) {
            throw "CSV file not found: $csvFilePath"
        }
    }
    catch {
        throw "Failed to export the CSV file: $_"
    }

This way works; the only thing is that the generated .csv file is only 19 KB / 101 lines, and I haven't found a way to make sure the result is exported completely.
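For what it's worth, one way to confirm that the truncation happens at the query stage rather than at export is to check the record count and the skip token on the response object. A quick diagnostic sketch, assuming the $result variable from the snippet above:

    # Diagnostic sketch (assumes $result from the Search-AzGraph call above).
    # Search-AzGraph returns at most 100 records per call by default;
    # a non-empty SkipToken means more pages are available on the server.
    Write-Output "Records returned: $($result.Data.Count)"
    Write-Output "More pages available: $([bool]$result.SkipToken)"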

I have also tried to send the result to a storage account, as follows:

    # Execute the query
    $result = Search-AzGraph -Query $query

    # Convert the result to CSV content (in memory)
    $csvContent = $result.Data | ConvertTo-Csv -NoTypeInformation

    # Upload CSV content to the Azure storage account
    $StorageAccountName = "saresourceinventory"
    $ContainerName = "resourceinventorycontainer"
    $StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storagekey

    $blobName = "ResourcesInventory.csv"

    # This fails: -File expects a local file path, but $blobName is only the blob's name
    Set-AzStorageBlobContent -Container $ContainerName -Blob $blobName -BlobType Block -Context $StorageContext -File $blobName -Force

but it seems I need to have the file exported to disk first before it can be uploaded to the storage account.

Do you have any other ideas on how to solve this?

Thanks for your help.

2 Answers


  1. To upload the file to a storage account, you could do it like this:

    The code below saves the file temporarily to disk and then uploads it to Azure. Finally, the temporary file is removed, and your file upload is complete 🙂

    # Get the result and convert it to CSV
    $result = Search-AzGraph -Query $query
    $csvContent = $result.Data | ConvertTo-Csv -NoTypeInformation
    
    # Write the file to "disk" (the content is CSV, so use a .csv extension)
    $csvContent | Out-File -FilePath './temporaryFile.csv' -Encoding utf8
    
    # Upload the CSV file to the Azure storage account
    $StorageAccountName = "saresourceinventory"
    $ContainerName = "resourceinventorycontainer"
    $blobName = "ResourcesInventory.csv"
    
    # Get the storage context and upload the csv file
    $StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storagekey
    Set-AzStorageBlobContent -File './temporaryFile.csv' -Container $ContainerName -Blob $blobName -Context $StorageContext -Force
    
    # Remove the temp file
    Remove-Item -Path './temporaryFile.csv'
    

    Good luck!

  2. Yep, I already tried that. The thing is that when I export the file to a temporary path, it is only 19 KB; I can't get it to produce a bigger file 🙁

    In my environment, when I used Search-AzGraph, the generated .csv file also contained only 101 lines.

    File: (screenshot of the generated .csv file)

    According to this MS document, you can use the script below, in which the Search-AzGraph command is run inside a while loop to fetch data in batches of five records per request, using the SkipToken parameter to handle pagination.

    Script:

    $kqlQuery = "Resources
    | join kind=leftouter (ResourceContainers
        | where type == 'microsoft.resources/subscriptions'
        | project subscriptionName = name, subscriptionId) on subscriptionId
    | where type =~ 'Microsoft.Compute/virtualMachines'
    | project VMResourceId = id, subscriptionName, resourceGroup, name"
    
    $batchSize = 5
    $skipResult = 0
    
    # Accumulate the rows from every page here
    $kqlResult = @()
    
    while ($true) {
        if ($skipResult -gt 0) {
            # Subsequent pages: pass the SkipToken returned by the previous call
            $graphResult = Search-AzGraph -Query $kqlQuery -First $batchSize -SkipToken $graphResult.SkipToken
        }
        else {
            # First page
            $graphResult = Search-AzGraph -Query $kqlQuery -First $batchSize
        }
    
        $kqlResult += $graphResult.data
    
        # A short (or empty) page means we have reached the end
        if ($graphResult.data.Count -lt $batchSize) {
            break
        }
        $skipResult += $batchSize
    }
    
    $csvFilePath = "$env:TEMP\sample.csv"
    
    $kqlResult | Export-Csv -Path $csvFilePath -NoTypeInformation
    
    
    $StorageAccountName = "venkat326"
    $ContainerName = "test"
    $blobName = "ResourcesInventory.csv"
    $storagekey = "xzzzzzz"
    
    # Get storage context and upload the csv file
    $StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storagekey
    Set-AzStorageBlobContent -File "$csvFilePath" -Container $ContainerName -Blob $blobName -Context $StorageContext -Force
    Remove-Item -Path $csvFilePath
    

    Output:

    The above code ran and uploaded a 392-line .csv file to Azure Blob Storage.
    (screenshot of the uploaded blob in the container)
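    If you want to confirm that the full result actually made it into the blob, one option (assuming the $ContainerName, $blobName and $StorageContext variables from the script above) is to download it back and count the rows:

        # Sanity-check sketch: download the uploaded blob and count its data rows.
        $checkPath = "$env:TEMP\check.csv"
        Get-AzStorageBlobContent -Container $ContainerName -Blob $blobName -Context $StorageContext -Destination $checkPath -Force
        $rowCount = (Import-Csv -Path $checkPath).Count
        Write-Output "Data rows in uploaded blob: $rowCount"
        Remove-Item -Path $checkPath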

