Dataflow – Azure – isDecimal
I have been trying to do a data type check on a decimal column of a file using a Data Flow in Azure Data Factory, but it is not working as expected. My issue is the following: I want…
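The truncated question does not show which check was attempted, but a minimal Python sketch of the kind of per-value decimal validation being described (outside the Data Flow expression language) might look like the following; the column name "amount" and the file name are illustrative assumptions, not from the question.

    # Sketch: flag values in a column that do not parse as decimals.
    # "amount" and "input.csv" are illustrative assumptions.
    import csv
    from decimal import Decimal, InvalidOperation

    def is_decimal(value: str) -> bool:
        try:
            Decimal(value)
            return True
        except (InvalidOperation, ValueError):
            return False

    with open("input.csv", newline="") as f:
        for row in csv.DictReader(f):
            if not is_decimal(row["amount"]):
                print("Bad value:", row["amount"])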
I need to update a table in an on-premises SQL Server database using a Data Flow in Azure Data Factory. I have a self-hosted Integration Runtime, and the linked services are created based on this self-hosted runtime. The dataset connection works fine. However…
I have an .exe application that outputs a file into the folder path you give it as input. The end goal is to get the output file into Blob Storage. I have considered Azure Functions but am not sure if it…
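One common pattern, whether the .exe runs in an Azure Function or elsewhere, is to upload the produced file with the azure-storage-blob SDK once the process finishes. A minimal sketch, assuming a connection string and illustrative container, path, and file names:

    # Sketch: run the .exe, then upload its output file to Blob Storage.
    # Paths, container name and connection string are illustrative assumptions.
    import subprocess
    from azure.storage.blob import BlobServiceClient

    subprocess.run(["myapp.exe", "C:\\output"], check=True)  # the .exe writes into C:\output

    service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    blob = service.get_blob_client(container="output", blob="result.txt")
    with open("C:\\output\\result.txt", "rb") as data:
        blob.upload_blob(data, overwrite=True)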
There is an article about adding credentials in Azure Data Factory: https://learn.microsoft.com/en-us/azure/data-factory/credentials?tabs=data-factory. It says to associate the user-assigned managed identity with the data factory instance using the Azure portal, SDK, PowerShell, or REST API. I am interested in the PowerShell or REST API option; the process…
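For the REST API route, the association is typically done by updating the factory resource with an identity block that lists the user-assigned managed identity. A rough Python sketch against the ARM endpoint, assuming the standard ARM identity payload shape; subscription, resource group, factory name, identity resource ID, and token are placeholders:

    # Sketch: PATCH the factory with a user-assigned managed identity via the ARM REST API.
    # All IDs, names and the bearer token below are placeholders.
    import requests

    url = ("https://management.azure.com/subscriptions/<sub-id>/resourceGroups/<rg>"
           "/providers/Microsoft.DataFactory/factories/<factory-name>?api-version=2018-06-01")
    body = {
        "identity": {
            "type": "SystemAssigned,UserAssigned",
            "userAssignedIdentities": {
                "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
                "Microsoft.ManagedIdentity/userAssignedIdentities/<identity-name>": {}
            }
        }
    }
    resp = requests.patch(url, json=body, headers={"Authorization": "Bearer <token>"})
    resp.raise_for_status()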
I am trying to replace a single quote in a string with two single quotes using the replace function in Data Factory expressions. For example, replace the single quote in the following string: hello'world ---> hello''world. @replace(pipeline().parameters.tst,''','''') The above expression is not working. Need…
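In the Data Factory expression language a literal single quote inside a string is itself written as two single quotes, which is usually where this goes wrong. The intended string transformation, shown as a plain Python sketch (the sample value comes from the question):

    # Sketch: the transformation the expression is meant to perform.
    value = "hello'world"
    escaped = value.replace("'", "''")   # doubles every single quote
    print(escaped)                       # hello''world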
I am getting an error while reading a Parquet file created by Databricks on ADLS. When I read these files using Databricks it works perfectly fine, and I am able to read and write data in these files from Databricks.…
I have two triggered Synapse pipelines, one of which is scheduled at 03:00 AM CST. What I'm looking for now is for the second pipeline to trigger after the completion of the first pipeline, i.e. after 03:00 AM CST. Is there…
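One way to express this dependency outside of triggers is to orchestrate from code: start the first pipeline, poll its run status, and only start the second one on success. A rough sketch, shown with the azure-mgmt-datafactory SDK for illustration (Synapse has analogous client libraries); the resource group, factory, and pipeline names are placeholders:

    # Sketch: run PipelineB only after PipelineA completes successfully.
    # Subscription, resource group, factory and pipeline names are placeholders.
    import time
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
    run = client.pipelines.create_run("<rg>", "<factory>", "PipelineA")
    while True:
        status = client.pipeline_runs.get("<rg>", "<factory>", run.run_id).status
        if status not in ("Queued", "InProgress"):
            break
        time.sleep(30)
    if status == "Succeeded":
        client.pipelines.create_run("<rg>", "<factory>", "PipelineB")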
I have a requirement where I want to get the ADLS storage account name and use it in an activity. I am using an event-based trigger, and I can get the container name, folder path, and file name from…
When copying a file from S3 to Azure Blob Storage, I would like to add a date and time string in addition to the source file name. In essence, the S3 folder structure looks like this: data/yyyy/mm/dd/files (yyyy=2019-2022, mm=01-12, dd=01-31). And when…
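The copy itself would normally be configured in the pipeline, but the naming scheme in question, source file name plus a date/time string, reduces to something like the following Python sketch; the source path and output format are illustrative assumptions:

    # Sketch: append a UTC timestamp to the source file name for the destination blob.
    # The source path below is an illustrative assumption.
    from datetime import datetime, timezone

    source_name = "data/2022/01/31/files/report.csv".split("/")[-1]   # report.csv
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
    dest_name = f"{source_name.rsplit('.', 1)[0]}_{stamp}.csv"        # e.g. report_20220131093000.csv
    print(dest_name)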
I am creating an Azure Data Factory pipeline using the Python SDK (azure.mgmt.datafactory.models.PipelineResource). I need to convert the PipelineResource object to a JSON file. Is that possible somehow? I tried json.loads(pipeline_object) and json.dumps(pipeline_object) but had no luck.
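json.dumps fails here because PipelineResource is an SDK model rather than a dict; the model exposes an as_dict() helper that converts it to plain Python types first. A minimal sketch, with a placeholder pipeline standing in for the real one:

    # Sketch: serialize a PipelineResource to a JSON file via the model's as_dict() helper.
    import json
    from azure.mgmt.datafactory.models import PipelineResource

    pipeline_object = PipelineResource(activities=[])   # placeholder; use your real pipeline here
    with open("pipeline.json", "w") as f:
        json.dump(pipeline_object.as_dict(), f, indent=2)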