I come here looking for some help.
I have a Resource Graph Explorer query that normally generates a 900 KB .csv with around 5k lines of information.
I have embedded the query inside a PowerShell runbook, executed from an Automation account, which then sends the .csv file attached to an email. The thing is that I'm trying to export the file this way:
$result = Search-AzGraph -Query $query
if ($null -eq $result -or $null -eq $result.Data) {
    throw "No data returned from the query. Please check the query and try again."
}
$csvFilePath = Join-Path $env:TEMP "ResourcesInventory.csv"
try {
    $result.Data | Export-Csv -Path $csvFilePath -NoTypeInformation
    Start-Sleep -Seconds 30
    if (-not (Test-Path $csvFilePath)) {
        throw "CSV file not found: $csvFilePath"
    }
}
catch {
    throw "Failed to export the CSV file: $_"
}
This way works; the only thing is that the generated .csv file is only 19 KB / 101 lines, and I haven't found a way to make sure the result is exported completely.
I have also tried to send the result to a storage account as follows:
# Execute the query
$result = Search-AzGraph -Query $query
# Convert the result to CSV content (kept in memory only; never written to disk)
$csvContent = $result.Data | ConvertTo-Csv -NoTypeInformation
# Upload the CSV content to the Azure storage account
$StorageAccountName = "saresourceinventory"
$ContainerName = "resourceinventorycontainer"
$StorageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storagekey
$blobName = "ResourcesInventory.csv"
# -File expects the path of a local file, but no such file has been created
Set-AzStorageBlobContent -Container $ContainerName -Blob $blobName -BlobType Block -Context $StorageContext -File $blobName -Force
But it seems I need to have the file exported to disk first to be able to upload it to the storage account, since Set-AzStorageBlobContent's -File parameter expects a local file path.
Do you guys have any other idea to solve this?
Thanks for your help.
2 Answers
To upload the file to a storage account, you could do it like this:
The code below saves the file temporarily to disk and then uploads it to Azure. Finally, the temporary file is removed, and your file upload is complete 🙂
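A minimal sketch of that approach, reusing $result and $storagekey and the account/container names from the question:

# Export the query result to a temporary file on disk
$tempFile = Join-Path $env:TEMP "ResourcesInventory.csv"
$result.Data | Export-Csv -Path $tempFile -NoTypeInformation
# Upload the temporary file to blob storage
$StorageContext = New-AzStorageContext -StorageAccountName "saresourceinventory" -StorageAccountKey $storagekey
Set-AzStorageBlobContent -Container "resourceinventorycontainer" -Blob "ResourcesInventory.csv" -File $tempFile -Context $StorageContext -Force
# Remove the temporary file once the upload has finished
Remove-Item $tempFile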
Good luck!
In my environment, when I used Search-AzGraph, the generated .csv file also contained only 101 lines (Search-AzGraph returns the first 100 records by default, plus the CSV header line).
According to this MS document, you can use the below script, where the Search-AzGraph command is used within a while loop to fetch data in batches of five records per request, utilizing the skipToken parameter to handle pagination.
Script:
Output:
The above code executed and uploaded a 392-line .csv file to Azure Blob Storage.