
Suppose I have a CSV file like this:

field1 field3
0      1
1      2

The corresponding table looks like this:

field1 field2 field3
null   null   null
...    ...    ...

Assuming all the fields are nullable, how can I import the CSV file into the table? I know I could use BULK INSERT with a format file, but I cannot generate a format file with bcp because the connection to the local SQL Server fails (weird!). Is there a better way to solve this problem?

2 Answers


  1. You could create a staging table with only the columns that are present in your CSV file, and then use BCP to load the data into it. After that, use INSERT INTO ... SELECT ... FROM to load the data into your target table.
    See this answer for more details.
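    For illustration, a minimal sketch of that approach is below. It substitutes a BULK INSERT statement for the bcp utility (since bcp could not connect in your case); the table names, file path, and connection string are placeholders, and the CSV is assumed to be comma-delimited with a header row.

        using System;
        using System.Data.SqlClient;

        class StagingLoad
        {
            static void Main()
            {
                // Placeholder connection string -- adjust for your server/database.
                string sqlconnectionstring = "Server=.;Database=MyDb;Integrated Security=true;";

                // 1) Create a staging table with only the columns that exist in the CSV.
                // 2) BULK INSERT needs no format file because the staging table matches the file layout.
                // 3) INSERT ... SELECT into the real table; the missing field2 simply stays NULL.
                string loadScript = @"
                    CREATE TABLE dbo.Staging_MyTable (field1 INT NULL, field3 INT NULL);

                    BULK INSERT dbo.Staging_MyTable
                    FROM 'C:\data\myfile.csv'
                    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

                    INSERT INTO dbo.MyTable (field1, field3)
                    SELECT field1, field3 FROM dbo.Staging_MyTable;

                    DROP TABLE dbo.Staging_MyTable;";

                using (SqlConnection connection = new SqlConnection(sqlconnectionstring))
                using (SqlCommand cmd = new SqlCommand(loadScript, connection))
                {
                    connection.Open();
                    cmd.ExecuteNonQuery();
                }
            }
        }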

  2. Working with CSV/TXT delimited files can be tricky when moving them into a SQL Server table. I've had users mess up columns or include too many columns, etc. My solution was to first use a StreamReader to read the file, add the header row into an array, and place the data itself into its own array. Then I loop through the header array and add each value as a column of an empty DataTable (a sketch of this step is shown below).
    So now I have a header array that holds the header names (which represent the column names in SQL Server) and a DataTable holding the actual data.
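    A minimal sketch of that reading step could look like the following (my assumptions: a comma-delimited file with a header row, a placeholder file path, and the same variable names that the snippets below use; it also needs using System.Data; and using System.IO;):

            // Header row goes into an array, data rows into a DataTable.
            string[] firstRowHeaderFromFileArray;
            DataTable dtWithColNames = new DataTable();

            using (StreamReader fileReader = new StreamReader(@"C:\data\myfile.csv"))
            {
                // First line = the column names from the file.
                firstRowHeaderFromFileArray = fileReader.ReadLine().Split(',');

                // One DataTable column per header value; these become the
                // source columns for SqlBulkCopy later on.
                foreach (string header in firstRowHeaderFromFileArray)
                {
                    dtWithColNames.Columns.Add(header);
                }

                // Remaining lines = data rows (assumes each row has the same
                // number of fields as the header).
                string line;
                while ((line = fileReader.ReadLine()) != null)
                {
                    dtWithColNames.Rows.Add(line.Split(','));
                }
            }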
    Then query the SQL Server table to get a dictionary of its column names and ordinal positions:

            // Requires: using System.Collections.Generic; and using System.Data.SqlClient;
            Dictionary<int, string> SQLTableColNamesDict = new Dictionary<int, string>();

            string Command = " SELECT COLUMN_NAME, ORDINAL_POSITION FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'table name' ";
            using (SqlConnection Connection = new SqlConnection(sqlconnectionstring))
            {
                Connection.Open();
                using (SqlCommand cmd = new SqlCommand(Command, Connection))
                {
                    using (SqlDataReader reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            // Key = ordinal position, value = upper-cased column name.
                            SQLTableColNamesDict.Add((int)reader[1], reader[0].ToString().ToUpper());
                        }
                    }
                }
            }
    

    Then loop through the header array and check whether the dictionary contains each header value (compared in upper case); if there is a match, add a SqlBulkCopy column mapping for that column:

            // Requires: using System.Linq; (for .First()) and using System.Data.SqlClient;
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlconnectionstring))
            {
                bulkCopy.DestinationTableName = SQLdestinationTable;
                bulkCopy.BatchSize = dtWithColNames.Rows.Count;

                foreach (string columnFromFile in firstRowHeaderFromFileArray)
                {
                    // Only map file columns that also exist in the SQL table.
                    if (SQLTableColNamesDict.ContainsValue(columnFromFile.ToUpper()))
                    {
                        string destinationColName =
                            SQLTableColNamesDict.First(item => item.Value == columnFromFile.ToUpper()).Value;

                        // Source = the DataTable column built from the file header,
                        // destination = the matching column in the SQL table.
                        SqlBulkCopyColumnMapping col = new SqlBulkCopyColumnMapping();
                        col.SourceColumn = columnFromFile;
                        col.DestinationColumn = destinationColName;

                        bulkCopy.ColumnMappings.Add(col);
                    }
                }
    

    Then, finally, do the bulk copy:

                int recordCount = dtWithColNames.Rows.Count;

                // Write from the source DataTable to the destination table.
                try
                {
                    bulkCopy.WriteToServer(dtWithColNames);
                }
                catch (Exception error)
                {
                    // Handle or log the error as appropriate for your application.
                    Console.WriteLine(error.Message);
                }
                finally
                {
                    bulkCopy.Close();
                }
            } // closes the using (SqlBulkCopy ...) block opened above
    

    This should allow the transfer to work even if the CSV file is malformed with extra columns; only the valid columns that match your SQL table are taken in.
