
I have an Azure Storage table in my DEV environment with, say, 100 rows. I would like to move the table and its data as-is from DEV to SIT, then to UAT, and finally to PROD.
So basically it's a lift-and-shift of the table together with its data.

Is there any solution available in the Azure portal, or via commands, etc.?

I heard that AzCopy can achieve this, but I'm not sure whether it works with tables.

For example, say I have an Employee table with columns emp id, name, address, contact, email, etc. in my DEV environment, and the code uses this table for read purposes only.
When my code moves to SIT and UAT, I need to move this dependent data to SIT and UAT as well.
So here I need to move the Employee table, with all its records and structure, from one environment to another.
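On the AzCopy question: the current AzCopy (v10) does not support Table Storage, but the older AzCopy v7.3 can export a table's entities to a folder and import them into a table in another storage account. A sketch, with account names, paths, and keys as placeholders (none of these values are from the original post):

```shell
# Export the Employee table from the DEV storage account to a local folder
# (AzCopy v7.3 syntax; devaccount, <dev-key>, C:\backup are placeholders)
AzCopy /Source:https://devaccount.table.core.windows.net/Employee/ /Dest:C:\backup\ /SourceKey:<dev-key> /Manifest:Employee.manifest

# Import the exported entities into the Employee table in the SIT account,
# upserting rows that already exist
AzCopy /Source:C:\backup\ /Dest:https://sitaccount.table.core.windows.net/Employee/ /DestKey:<sit-key> /Manifest:Employee.manifest /EntityOperation:InsertOrReplace
```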

2 Answers


  1. As mentioned by @Quentin Geff, you can use Azure Data Factory to move data from one table to another. For the sample I used 3 rows; the same approach works for your 100 rows.

    • Create two table storages, one as the source and one as the destination.

    • Create two linked services in ADF, one for the source table and one for the destination table.

    • To create a linked service, go to the ADF Manage hub >> Linked services >> + New >> Azure Table Storage, fill in the required details (linked service name, subscription, and storage account name), and select Create.

    • In your case the source will be the DEV environment and the destination will be the UAT or SIT environment.

    • Then create two datasets, one for each linked service.

    • To create a dataset, go to the Author hub >> Datasets >> New dataset >> select Azure Table Storage >> select the linked service you created.

    • Then add a Copy activity to a pipeline. Select the DEV dataset as the source and the UAT or SIT dataset as the sink.

    • Configure the sink so that the partition key is copied unchanged from the source table.

    • Run the pipeline and the data will be copied to the UAT or SIT environment.

    • You can add another Copy activity in the same pipeline, with the UAT/SIT table as the source and the PROD environment as the sink.
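    The portal steps above boil down to a Copy activity wired between an Azure Table source dataset and an Azure Table sink dataset. A minimal pipeline definition sketch — the pipeline, activity, and dataset names here are assumptions, not from the original answer:

    ```json
    {
      "name": "CopyDevToSit",
      "properties": {
        "activities": [
          {
            "name": "CopyEmployeeTable",
            "type": "Copy",
            "inputs": [ { "referenceName": "DevEmployeeTable", "type": "DatasetReference" } ],
            "outputs": [ { "referenceName": "SitEmployeeTable", "type": "DatasetReference" } ],
            "typeProperties": {
              "source": { "type": "AzureTableSource" },
              "sink": {
                "type": "AzureTableSink",
                "azureTablePartitionKeyName": "PartitionKey",
                "azureTableRowKeyName": "RowKey",
                "azureTableInsertType": "replace"
              }
            }
          }
        ]
      }
    }
    ```

    Pointing `azureTablePartitionKeyName` and `azureTableRowKeyName` at the source's own key columns is what keeps the keys identical between environments.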

  2. If you want to do the data migration yourself in code, you could export all data to CSV (Blob Storage or a local file) and then import it into the target table, either by code or with Azure Storage Explorer.

    I wrote a little library to make the CSV export easier:

    using Azure.Data.Tables;
    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Models;
    using Medienstudio.Azure.Data.Tables.CSV;
    
    TableServiceClient tableServiceClient = new(connectionString);
    TableClient tableClient = tableServiceClient.GetTableClient("tablename");
    
    // Export all rows from the table to a local CSV file
    using (StreamWriter fileWriter = File.CreateText("test.csv"))
    {
        await tableClient.ExportCSVAsync(fileWriter);
    }
    
    // Export all rows as CSV to Azure Blob Storage
    BlobContainerClient containerClient = new(BlobConnectionString, "testcontainer");
    var blobClient = containerClient.GetBlobClient("test.csv");
    var stream = await blobClient.OpenWriteAsync(true, new BlobOpenWriteOptions() { HttpHeaders = new BlobHttpHeaders { ContentType = "text/csv" } });
    using (StreamWriter blobWriter = new(stream))
    {
        await tableClient.ExportCSVAsync(blobWriter);
    }
    
    // Import all rows from a CSV file into the table
    using StreamReader reader = new("test.csv");
    await tableClient.ImportCSVAsync(reader);
    

    Source
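    If you don't need the CSV as an intermediate artifact, you can also copy entities directly between accounts with the Azure.Data.Tables client. A sketch, assuming both environments expose the table under the same name; the connection strings are placeholders:

    ```csharp
    using Azure.Data.Tables;

    // <dev-connection-string> and <sit-connection-string> are placeholders
    var source = new TableClient("<dev-connection-string>", "Employee");
    var dest = new TableClient("<sit-connection-string>", "Employee");

    // Create the destination table if it does not exist yet
    await dest.CreateIfNotExistsAsync();

    // Stream every entity from the source table and upsert it into the
    // destination, preserving PartitionKey and RowKey
    await foreach (TableEntity entity in source.QueryAsync<TableEntity>())
    {
        await dest.UpsertEntityAsync(entity);
    }
    ```

    Upserting (rather than inserting) makes the copy safe to re-run after a partial failure.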
