I have a Quartz job that runs once every 30 days in my ASP.NET application. The job loops through a range of values and passes each one to an API, which returns JSON data. I am trying to store that data in a temporary table in SQL Server using a stored procedure, but because roughly 30 asynchronous requests run at the same time, the database calls fail with a timeout error.
class RunJob : IJob
{
    // API endpoint used by the job (defined elsewhere in the real code)
    private readonly string url = "...";

    public void Execute(IJobExecutionContext context)
    {
        int start = 1, end = 100;
        for (int i = start; i < end; i++)
        {
            // fire-and-forget: the returned Task is never awaited
            GetAsyncFunction(i);
        }
    }

    public async Task GetAsyncFunction(int i)
    {
        HttpClient client = new HttpClient();
        HttpResponseMessage res = await client.GetAsync(url + "?param=" + i.ToString());
        if (res.IsSuccessStatusCode)
        {
            // insert the stringified JSON data into the DB table via a stored procedure
        }
    }
}
//call Execute() method in main
Is there a better approach to store the async calls and execute them sequentially? Thanks in advance.
2 Answers
It sounds like you create a new connection to the database with each request but don't Dispose the connection after the request ends. The database will then run out of its maximum connection limit and start rejecting requests. Check whether that is the case.
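If that is the problem, wrapping the connection and command in using blocks guarantees they are released back to the pool even when a call fails. A minimal sketch, assuming the insert goes through a stored procedure; the names connectionString, usp_InsertJobResult and @JsonData are placeholders, not taken from the question:

// requires: using System.Data; using System.Data.SqlClient;
public async Task InsertJsonResult(string json, string connectionString)
{
    using (var connection = new SqlConnection(connectionString))            // placeholder connection string
    using (var command = new SqlCommand("usp_InsertJobResult", connection)) // placeholder proc name
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.AddWithValue("@JsonData", json);                 // placeholder parameter

        await connection.OpenAsync();
        await command.ExecuteNonQueryAsync();
    } // both objects are disposed here, even if an exception is thrown
}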
The code shows 100 concurrent async calls that are never awaited. There's no database code here, but 100 concurrent calls is nothing for SQL Server anyway. The code does contain serious errors and leaks though: GetAsyncFunction is called but never awaited, so 100 HTTP call attempts are made to the same server at the same time, and a new HttpClient is created for every call, which leaks connections. There's a limit to how many concurrent calls can be made to the same endpoint, which is 2 on the old .NET Framework.

Quartz.NET supports asynchronous execution methods. From the Quick Start page:
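In Quartz.NET 3.x the IJob interface declares Execute as returning a Task, so the job body itself can be async. A minimal sketch along the lines of the Quick Start example:

public class HelloJob : IJob
{
    public async Task Execute(IJobExecutionContext context)
    {
        // the job can await asynchronous work directly
        await Console.Out.WriteLineAsync("Greetings from HelloJob!");
    }
}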
This alone allows executing the 100 requests sequentially:
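A sketch of what that could look like in the job from the question, assuming GetAsyncFunction keeps returning a Task:

public async Task Execute(IJobExecutionContext context)
{
    int start = 1, end = 100;
    for (int i = start; i < end; i++)
    {
        // each request completes (and its result is stored) before the next one starts
        await GetAsyncFunction(i);
    }
}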
The GetAsyncFunction method should just make the HTTP call and store the result. In this case I use Dapper to reduce the ADO.NET boilerplate to a single call. Dapper creates a parameterized query from the supplied object's properties and takes care of opening and closing the connection:
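A sketch of such a method using Dapper's ExecuteAsync; the static HttpClient, the url and connectionString fields, and the stored procedure and parameter names are assumptions for illustration:

// requires the Dapper NuGet package
// using Dapper; using System.Data; using System.Data.SqlClient; using System.Net.Http;
static readonly HttpClient client = new HttpClient(); // reuse one client instead of one per call

public async Task GetAsyncFunction(int i)
{
    var response = await client.GetAsync(url + "?param=" + i);
    if (!response.IsSuccessStatusCode)
    {
        return;
    }

    var json = await response.Content.ReadAsStringAsync();

    using (var connection = new SqlConnection(connectionString))
    {
        // Dapper opens/closes the connection and builds parameters from the anonymous object
        await connection.ExecuteAsync(
            "usp_InsertJobResult",              // placeholder stored procedure name
            new { Param = i, JsonData = json }, // placeholder parameter names
            commandType: CommandType.StoredProcedure);
    }
}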
This will execute the requests sequentially. To execute multiple requests concurrently without starting all the tasks at once, you can use Parallel.ForEachAsync in .NET 6, or e.g. an ActionBlock in earlier versions, to execute a specific number of requests at a time:
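A sketch of both options; the degree of parallelism of 10 is just an example value:

// .NET 6+: run at most 10 requests at a time
var options = new ParallelOptions { MaxDegreeOfParallelism = 10 };
await Parallel.ForEachAsync(Enumerable.Range(1, 100), options,
    async (i, cancellationToken) =>
    {
        await GetAsyncFunction(i);
    });

// Earlier versions: TPL Dataflow ActionBlock (System.Threading.Tasks.Dataflow package)
var block = new ActionBlock<int>(
    i => GetAsyncFunction(i),
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 10 });

for (int i = 1; i < 100; i++)
{
    block.Post(i);
}
block.Complete();       // no more items will be posted
await block.Completion; // wait for all queued requests to finish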