I have a database delta file (.sql) containing SQL statements such as INSERT, DELETE and UPDATE. The database is already set up with the tables and data; I am simply trying to execute these updates.
I am trying to find a way in Azure Data Factory to read this file from Blob Storage and execute the statements against the MySQL database, ideally using the Copy Data activity, since the workload is resource intensive and I also want to avoid a separate activity call for each statement. I would greatly appreciate some insight into whether this is possible.
My file contains around 28 million inserts. I am hoping to be able to utilise the Copy Data activity to achieve this.
Thank you
Here is an example of the start of the file; the rest follows with similar statements.
BEGIN TRANSACTION;
INSERT INTO FILE(sha256,sha1,md5,crc32,file_name,file_size,package_id) VALUES('000005A9BB818F8B3493DDCED88D1789679046E84ABE71EC3A79A1ED4DE815E2','30330D39A84E2B7B240F1DD8904AC5DAA1DD263C','3DD77458C9D1C38CF5880E08816CAEFE','66058703','.rela.text.frida_socket_host_session_provider_host_entry_class_init',1416,298522);
INSERT INTO FILE(sha256,sha1,md5,crc32,file_name,file_size,package_id) VALUES('000005A9BB818F8B3493DDCED88D1789679046E84ABE71EC3A79A1ED4DE815E2','30330D39A84E2B7B240F1DD8904AC5DAA1DD263C','3DD77458C9D1C38CF5880E08816CAEFE','66058703','.rela.text.frida_socket_host_session_provider_host_entry_class_init',1416,298553);
INSERT INTO FILE(sha256,sha1,md5,crc32,file_name,file_size,package_id) VALUES('000005A9BB818F8B3493DDCED88D1789679046E84ABE71EC3A79A1ED4DE815E2','30330D39A84E2B7B240F1DD8904AC5DAA1DD263C','3DD77458C9D1C38CF5880E08816CAEFE','66058703','.rela.text.frida_socket_host_session_provider_host_entry_class_init',1416,298615);
INSERT INTO FILE(sha256,sha1,md5,crc32,file_name,file_size,package_id) VALUES('000005A9BB818F8B3493DDCED88D1789679046E84ABE71EC3A79A1ED4DE815E2','30330D39A84E2B7B240F1DD8904AC5DAA1DD263C','3DD77458C9D1C38CF5880E08816CAEFE','66058703','.rela.text.frida_socket_host_session_provider_host_entry_class_init',1416,299558);
INSERT INTO FILE(sha256,sha1,md5,crc32,file_name,file_size,package_id) VALUES('0000070B8C662EABB12B94C87CFD44DF7EA729B92C03B9F2336BA226D0A4D87C','5DF28838F14488AE757A26E8D294B8557F0DDA92','D2D35205949F5DF7D068AE05143CBA1F','3F045301','STRINGSSV.JS',14324,298858);
INSERT INTO FILE(sha256,sha1,md5,crc32,file_name,file_size,package_id) VALUES('0000070B8C662EABB12B94C87CFD44DF7EA729B92C03B9F2336BA226D0A4D87C','5DF28838F14488AE757A26E8D294B8557F0DDA92','D2D35205949F5DF7D068AE05143CBA1F','3F045301','STRINGSSV.JS',14324,298866);
2 Answers
Copy Data transfers data from a file to a database. It will not execute the SQL statements in the file against the MySQL database.
For this, you can see my answer here on executing statements from a .sql file against an Azure MySQL database. The other workaround is to import the file into MySQL from a storage account file share, as below:
Navigate to your Azure Database for MySQL server in the portal and run Azure Cloud Shell (Bash). If you run it for the first time, it will ask you to create storage, and this will mount an Azure file share in it.
Type df in the Cloud Shell and note the Azure Files path.
Upload the SQL file to the storage account file share you got from the df output (here it is under the file system "cs110032001d078f3ab"): navigate to that file system in the Azure portal and upload your SQL file.
Back in the Cloud Shell, change to the mounted path, cd /usr/csuser/clouddrive, which I got from the df output, and run the uploaded SQL file against the database from there.
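A minimal sketch of that last step, assuming the delta file was uploaded to the share as delta.sql; the server name, admin user and database name are placeholders for your own values:

# change into the mounted Azure file share
cd /usr/csuser/clouddrive
# run the uploaded statements against Azure Database for MySQL (prompts for the password)
mysql -h <your-server>.mysql.database.azure.com -u <admin-user> -p <database-name> < delta.sql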
With a little bit of editing of that file, you could use LOAD DATA to insert the data. Remove the leading INSERT INTO FILE(sha256,sha1,md5,crc32,file_name,file_size,package_id) VALUES( and the trailing ); from each line to get a valid line for a suitably written LOAD DATA to read.
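A minimal sketch of that approach, assuming the stripped lines are saved as file_delta.csv (one comma-separated value list per line, strings in single quotes) and loaded into the FILE table from the sample; the file name and the field/line terminators are assumptions:

-- load the edited delta file into the FILE table
LOAD DATA LOCAL INFILE 'file_delta.csv'
INTO TABLE FILE
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\''
LINES TERMINATED BY '\n'
(sha256, sha1, md5, crc32, file_name, file_size, package_id);

Depending on the server and client configuration, local_infile may need to be enabled for this to work.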