I am using ASP.NET as a backend to an application to update some data in a separate system’s SQL Server.

I have a query that needs to insert many rows to a table. In each row, the values being inserted are derived from parameters. To protect against SQL injection attacks, I previously used the SqlCommand object and added parameters like so:

string query = @"
    INSERT INTO [dbo].[table_name] (id, type, value) VALUES (@id, 10, @value1);
    INSERT INTO [dbo].[table_name] (id, type, value) VALUES (@id, 15, @value2);
    INSERT INTO [dbo].[table_name] (id, type, value) VALUES (@id, 20, @value3);
    ...
";

using (SqlConnection connection = new SqlConnection(ConfigurationManager.ConnectionStrings["context"].ConnectionString))
{
    SqlCommand sql_command = new SqlCommand(query, connection);
    sql_command.Parameters.AddWithValue("@id", id);
    sql_command.Parameters.AddWithValue("@value1", value1);
    sql_command.Parameters.AddWithValue("@value2", value2);
    sql_command.Parameters.AddWithValue("@value3", value3);
    ...
    connection.Open();
    sql_command.ExecuteNonQuery();
}

This worked well, but for concurrency’s sake I would like to execute these inserts as a single batch using SQL’s "GO" statement, so that they could not be interrupted by another process locking access to this table. Since SqlCommand.ExecuteNonQuery does not support the "GO" statement, I looked into ServerConnection.ExecuteNonQuery instead (from Microsoft’s SQL Management Objects package). This was promising, as it does support "GO" statements, but I can’t find a way to add parameters to my query when executing: according to Microsoft’s docs here, none of the ExecuteNonQuery overloads accept a parameter list.
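
For reference, the SMO call I was experimenting with looks roughly like this (just a sketch; ServerConnection comes from Microsoft.SqlServer.Management.Common, and "query" stands for the multi-statement script above with GO separators added):

// Sketch only: SMO's ServerConnection splits the script on "GO" itself,
// but as far as I can tell no ExecuteNonQuery overload accepts SqlParameters.
using (SqlConnection connection = new SqlConnection(ConfigurationManager.ConnectionStrings["context"].ConnectionString))
{
    ServerConnection server_connection = new ServerConnection(connection);
    server_connection.ExecuteNonQuery(query);
}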

Does anyone know how I can achieve this goal? Is it necessary that I use the "GO" statement? Or is there another way of achieving this kind of "batched" execution with the standard SqlCommand.ExecuteNonQuery?

2 Answers


  1. Chosen as BEST ANSWER

    Following Ralf's suggestion, I edited the insert to look like the following. Sometimes I guess we can just overthink these things :)

    INSERT INTO [dbo].[table_name] (id, type, value)
    VALUES (@id, 10, @value1), (@id, 15, @value2), (@id, 20, @value3) ...;
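
    For completeness, here is roughly how it slots into the SqlCommand code from the question (a sketch; the parameter handling is unchanged):

    string query = @"
        INSERT INTO [dbo].[table_name] (id, type, value)
        VALUES (@id, 10, @value1), (@id, 15, @value2), (@id, 20, @value3);";

    using (SqlConnection connection = new SqlConnection(ConfigurationManager.ConnectionStrings["context"].ConnectionString))
    {
        SqlCommand sql_command = new SqlCommand(query, connection);
        sql_command.Parameters.AddWithValue("@id", id);
        sql_command.Parameters.AddWithValue("@value1", value1);
        sql_command.Parameters.AddWithValue("@value2", value2);
        sql_command.Parameters.AddWithValue("@value3", value3);

        connection.Open();
        // a single INSERT statement, so the rows go in atomically
        sql_command.ExecuteNonQuery();
    }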
    

  2. GO is not actually part of the T-SQL language. Rather, it’s a batch separator recognized by client tools such as SSMS and sqlcmd, and some of them even let you change it to a different word.

    Worse, even when you use it, it’s not going to improve concurrency for this operation.
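
    (For illustration: tools that honour GO just split the script on lines containing GO and send each chunk as its own command, roughly like the sketch below. That is exactly why it buys you nothing here: each batch is still a separate round trip that other sessions can interleave with.)

    // Rough sketch of what a GO-aware tool does client-side.
    // Assumes "script" holds the text and "connection" is already open;
    // needs using System.Text.RegularExpressions;
    string[] goBatches = Regex.Split(script, @"^\s*GO\s*$", RegexOptions.Multiline | RegexOptions.IgnoreCase);
    foreach (string batch in goBatches)
    {
        if (string.IsNullOrWhiteSpace(batch)) continue;
        using (var cmd = new SqlCommand(batch, connection))
        {
            cmd.ExecuteNonQuery(); // one round trip per batch, no extra isolation
        }
    }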

    What you can do is manage the concurrency in your app. So if the values came from a List/IEnumerable, you could do something like this extremely naïve example:

    public async Task InsertRecords(int id, IEnumerable<string> valuesToInsert)
    {
        // setup query to use batches of size n
        string query = @"
            INSERT INTO [dbo].[table_name] (id, type, value) VALUES (@id, 10, @value1);
            INSERT INTO [dbo].[table_name] (id, type, value) VALUES (@id, 15, @value2);
            INSERT INTO [dbo].[table_name] (id, type, value) VALUES (@id, 20, @value3);
            ...
            INSERT INTO [dbo].[table_name] (id, type, value) VALUES (@id, N*5, @valueN);
        ";
        
        var batches = valuesToInsert.Chunk({your batch size here});
        var tasks = batches.Select(async b => {
            using var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["context"].ConnectionString);
            using var cmd = new SqlCommand(query, connection);
        
            // AVOID ADDWITHVALUE()!
            cmd.Parameters.Add("@id", SqlDbType.Int).Value = id;
            for(int i = 0; i < b.Length; i++)
            {
                // making up the parameter type. Use what's actually in the DB!
                cmd.Parameters.Add($"@value{i+1}", SqlDbType.NVarChar, 20).Value = b[i];
            }
        
            await connection.OpenAsync();
            // return the affected row count so Task.WhenAll has something to report
            return await cmd.ExecuteNonQueryAsync();
        });
        var results = await Task.WhenAll(tasks); // int[] of rows affected per batch
    }
    

    BUT!

    Even with this, a set of inserts to the same table will probably cause database locking in a way that forces this to be a serial operation! It also leaves you open to a partial success/failure for the operation, making rollbacks and troubleshooting more difficult.
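
    If partial failure is the bigger worry, a simpler option (a sketch, not a drop-in replacement for the parallel version above) is to give up the parallelism, keep everything on one connection, and wrap the inserts in an explicit transaction so they commit or roll back together:

    // Sketch: serial, but all-or-nothing.
    using var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["context"].ConnectionString);
    await connection.OpenAsync();
    using var transaction = connection.BeginTransaction();
    try
    {
        using var cmd = new SqlCommand(query, connection, transaction);
        cmd.Parameters.Add("@id", SqlDbType.Int).Value = id;
        // ...add the @valueN parameters as above...

        await cmd.ExecuteNonQueryAsync();
        transaction.Commit();
    }
    catch
    {
        transaction.Rollback();
        throw;
    }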
