
I have a ForEachFile activity that gets the files from a Get Metadata activity. The files are JSON.
Within this ForEachFile activity I have a Copy activity that should insert the contents into a SQL Server database table and, in addition, add the current filename as an extra column in the table.
So I googled and found several examples that use the Additional columns feature, so I thought this would be easy.
In Additional columns on the Source tab I added a dynamic column named Filename with the value @item.name, because each iteration of the ForEachFile loop has @item.name as the filename. Here is a screenshot of the Source:
(screenshot of the Source tab)

All the values in the JSON file are inserted into the database, but the filename is empty.
If I create the table first and then run the pipeline, I can see that the Filename column is null. If I let the sink create the table, I can see that no Filename column is created at all.
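As far as I understand, in the authored pipeline JSON the additional column should be stored as a name/value pair, roughly like this (a sketch, not my exact definition; the Expression wrapper is what Add dynamic content normally generates):

```json
"additionalColumns": [
    {
        "name": "Filename",
        "value": {
            "value": "@item().name",
            "type": "Expression"
        }
    }
]
```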

Mapping
(screenshot of the mapping)

When I added Filename to the mapping:
(screenshot of the updated mapping)

I get this error:
(screenshot of the error)

Here is the input to the activity CopyJsonToSql. Note that I have three of these because I have three files, so the value of "additionalColumns" is different for each:
{
    "source": {
        "type": "JsonSource",
        "additionalColumns": "516e59b7-13cd-46e9-a0c7-1d04e16b4b00.json",
        "storeSettings": {
            "type": "AzureBlobStorageReadSettings",
            "recursive": true,
            "enablePartitionDiscovery": false
        },
        "formatSettings": {
            "type": "JsonReadSettings"
        }
    },
    "sink": {
        "type": "SqlServerSink",
        "writeBehavior": "insert",
        "sqlWriterUseTableLock": false,
        "tableOption": "autoCreate"
    },
    "enableStaging": false,
    "translator": {
        "type": "TabularTranslator",
        "mappings": [
            {
                "source": { "path": "['id']" },
                "sink": { "name": "id", "type": "String" }
            },
            {
                "source": { "path": "['direction']" },
                "sink": { "name": "direction", "type": "String" }
            },
            {
                "source": { "path": "['count']" },
                "sink": { "name": "count", "type": "String" }
            },
            {
                "source": { "path": "['date']" },
                "sink": { "name": "date", "type": "String" }
            },
            {
                "source": { "path": "['Filename']" },
                "sink": { "name": "Filename", "type": "String" }
            }
        ],
        "collectionReference": "$['counts']['count']",
        "mapComplexValuesToString": true
    }
}

Here I have added the relevant activity in the pipeline:
(screenshot of the pipeline)

Now I get null as the filename in spite of having added Filename as an additional column with the value @item().name. I also used Add dynamic content, where I chose the processeachJsonFile current item.
New screenshot:
(screenshot of the Additional columns setting)

Many thanks in advance
Tony Johansson

2 Answers


  1. Chosen as BEST ANSWER

    I managed to solve it as I intended in the first place. In the ForEach activity I have a Copy activity called CopyJsonToSql and after that a Stored procedure activity. CopyJsonToSql inserts the data from the JSON files into the database table, and the column called Filename is null at that point because there is no field called Filename in the JSON file. To the stored procedure SaveFilenameToSql I pass the filename as @item().name. The stored procedure then updates the database table with the correct filename. The stored procedure looks like this:
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    ALTER procedure [dbo].[p_CarCounter]
    @Filename varchar(100)
    as
    begin
        update dbo.CarCounter
        set Filename = @Filename
        where Filename is null;
    end
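
    The Stored procedure activity calls it once per file; invoked manually, the update for one of my three files would look like this:

```sql
-- Stamp the rows that were just copied (Filename still null) with the current file's name
EXEC dbo.p_CarCounter @Filename = '516e59b7-13cd-46e9-a0c7-1d04e16b4b00.json';
```

    Note that this relies on the copy and the stored procedure running back to back per file, so that the only null Filename rows are the ones from the file just copied.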
    

    • In order to copy the value of the additional column to the sink table, you need to alter the mapping settings that are already defined.
    • In the Mapping tab of the copy activity, click on Import schema.
    • All fields from the source, along with the additional column, get imported.
    • Map the additional column to the respective field in the sink.
      (screenshot of the mapping)
    • Execute the pipeline; the data will be copied to the sink.
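
    After importing the schema, the Source side of the copy activity should serialize the additional column as a name/value pair rather than a plain string; a minimal sketch of the relevant fragment (the Expression wrapper is how dynamic content is normally stored, so treat this as illustrative):

```json
"source": {
    "type": "JsonSource",
    "additionalColumns": [
        {
            "name": "Filename",
            "value": {
                "value": "@item().name",
                "type": "Expression"
            }
        }
    ]
}
```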