
I have a Quartz job that runs once every 30 days in my ASP.NET application. The job loops through a range of parameters and passes each one to an API, which returns JSON data. I am trying to store that data in a temporary table in SQL Server using a stored procedure, but because ~30 asynchronous requests run at once, the database calls fail with a timeout error.

  class RunJob : IJob
  {
    public void Execute(IJobExecutionContext context)
    {
      var start = 1, end = 100;
      for (int i = start; i < end; i++)
      {
        GetAsyncFunction(i);
      }
    }

    public async Task GetAsyncFunction(int i)
    {
      HttpClient client = new HttpClient();
      // url is a field holding the API base address
      HttpResponseMessage res = await client.GetAsync(url + "?param=" + i.ToString());
      if (res.IsSuccessStatusCode)
      {
        // insert stringified JSON data into the DB table
      }
    }
  }
// call Execute() method in Main

Is there a better approach to queue the async calls and execute them sequentially? Thanks in advance.

2 Answers


  1. It sounds like you create a new connection to the database with each request but never Dispose it after the request ends. The server then hits its maximum connection limit and starts rejecting new requests. Check whether that is the case.
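
    For example, a minimal sketch of the disposal pattern with plain ADO.NET (the table and column names here are made up for illustration):

    using System.Data.SqlClient;

    public async Task InsertJsonAsync(string cns, string json)
    {
        // using guarantees the connection is closed and returned
        // to the pool even if the insert throws
        using (var cn = new SqlConnection(cns))
        using (var cmd = new SqlCommand(
            "INSERT INTO SomeTable (Json) VALUES (@json)", cn))
        {
            cmd.Parameters.AddWithValue("@json", json);
            await cn.OpenAsync();
            await cmd.ExecuteNonQueryAsync();
        }
    }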

  2. The code shows 100 concurrent async calls that are never awaited. There's no database code in the question, but 100 concurrent calls is nothing for SQL Server. The code contains serious errors and leaks, though:

    • 100 calls are made to GetAsyncFunction but never awaited. This means 100 HTTP call attempts are made to the same server at the same time. There's a limit to how many concurrent calls can be made to the same endpoint, which defaults to 2 in the old .NET Framework.
    • The remote service may not be able to handle 100 concurrent calls from the same client, or may throttle them. All web servers (IIS, Apache, NGINX) impose such limits to keep the server up under heavy traffic from misbehaving clients.
    • 100 new HttpClient instances are created, which results in 100 new sockets that never close. HttpClient is thread-safe and meant to be reused; a single HttpClient instance only uses as many sockets as needed (see the sketch after this list).
    • There's no database code, but if it's similar to the network code, it probably means 100 connections are created and never closed, resulting in a 100-way lock conflict. In every database, INSERTs take locks on tables and indexes that conflict with similar locks held by other connections.
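
    As a sketch, a single shared HttpClient plus (on .NET Framework only) a higher ServicePointManager.DefaultConnectionLimit looks like this; the limit value of 20 is just an example:

    using System.Net;
    using System.Net.Http;

    static class ApiClient
    {
        // one HttpClient for the whole application, reused by every request
        public static readonly HttpClient Instance = new HttpClient();

        static ApiClient()
        {
            // .NET Framework only: raise the default per-endpoint limit of 2.
            // .NET Core/5+ ignores ServicePointManager for HttpClient.
            ServicePointManager.DefaultConnectionLimit = 20;
        }
    }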

    Quartz.NET supports asynchronous execution methods. From the Quick Start page:

    public class HelloJob : IJob
    {
        public async Task Execute(IJobExecutionContext context)
        {
            await Console.Out.WriteLineAsync("Greetings from HelloJob!");
        }
    }
    

    This alone allows executing the 100 requests sequentially:

    class RunJob : IJob
    {
        public async Task Execute(IJobExecutionContext context)
        {
            var dataMap = context.JobDetail.JobDataMap;

            var start = 1, end = 100;
            var baseUrl = dataMap.GetString("url");
            var cns = dataMap.GetString("ConnectionString");

            using var client = new HttpClient();
            for (int i = start; i < end; i++)
            {
                var url = $"{baseUrl}?param={i}";
                await GetAsyncFunction(client, url, cns);
            }
        }
    }


    GetAsyncFunction should just make the HTTP call and store the result. In this case I use Dapper to reduce the ADO.NET boilerplate to a single call. Dapper creates a parameterized query from the supplied object's properties and takes care of opening and closing the connection:

    using Dapper;
    using System.Data.SqlClient;

    public async Task GetAsyncFunction(HttpClient client, string url, string cns)
    {
        var sql = "INSERT INTO SomeTable (Url, Json) VALUES (@url, @json)";
        try
        {
            var json = await client.GetStringAsync(url);
            using var cn = new SqlConnection(cns);
            await cn.ExecuteAsync(sql, new { url, json });
        }
        catch (HttpRequestException ex)
        {
            // handle HTTP errors
        }
        catch (Exception ex)
        {
            // handle everything else
        }
    }
    

    This will execute the requests sequentially. To execute multiple requests concurrently without starting all the tasks at once, you can use Parallel.ForEachAsync in .NET 6, or e.g. an ActionBlock in earlier versions, to execute a specific number of requests at a time:

    public async Task Execute(IJobExecutionContext context)
    {
        var dataMap = context.JobDetail.JobDataMap;

        var start = 1, end = 100;
        var baseUrl = dataMap.GetString("url");
        var cns = dataMap.GetString("ConnectionString");
        var dop = 10;
        var urls = Enumerable.Range(start, end - start)
                             .Select(i => $"{baseUrl}?param={i}");

        using var handler = new HttpClientHandler
        {
            MaxConnectionsPerServer = 20
        };
        using var client = new HttpClient(handler);

        var options = new ParallelOptions
        {
            MaxDegreeOfParallelism = dop
        };

        await Parallel.ForEachAsync(urls, options, async (url, ct) =>
        {
            await GetAsyncFunction(client, url, cns);
        });
    }
    
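    For runtimes before .NET 6, a rough equivalent using an ActionBlock from TPL Dataflow (with the same urls, client, cns, and dop as above) could look like this:

    using System.Threading.Tasks.Dataflow;

    var block = new ActionBlock<string>(
        url => GetAsyncFunction(client, url, cns),
        new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = dop });

    foreach (var url in urls)
    {
        block.Post(url);
    }

    block.Complete();        // signal no more input
    await block.Completion;  // wait until all posted URLs are processed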