I have a question about concurrent access to an object with singleton lifetime in ASP.NET Core. Suppose there are two threads in this structure: one is the normal request/response thread used by ASP.NET, and the other is a worker service running continuously in the background.
My plan is to create a task queue and store in it the tasks I don't want to execute on the request/response thread. These stored tasks are then executed continuously in the background.
This part of the code contains the task queue. The class is used in the background worker service and anywhere else in ASP.NET:
public class EventQueue : IEventQueue
{
    public LinkedList<Task> Queue = new LinkedList<Task>();

    // Add a new task to the front of the list.
    public void AddEvent(Task task)
    {
        Queue.AddFirst(task);
    }

    // Take the oldest task from the back of the list, or null if the queue is empty.
    public Task GetNextEvent()
    {
        if (Queue.Last == null)
            return null;

        var task = Queue.Last.Value;
        Queue.RemoveLast();
        return task;
    }
}
This part of the code contains the worker service. It executes the queued tasks one by one:
public class QueueWorker : BackgroundService
{
    private readonly IEventQueue _queue;

    public QueueWorker(IEventQueue queue)
    {
        _queue = queue;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            var task = _queue.GetNextEvent();
            if (task != null)
                task.RunSynchronously();
        }
    }
}
This part of the code contains the service registrations:
services.AddSingleton<IEventQueue, EventQueue>();
services.AddHostedService<QueueWorker>();
Questions:
- Does this structure work well? I think it will not work well because there is concurrent access to the queue instance. Or rather, the worker service is constantly accessing the queue instance, so other threads may never get a chance to access it. Is this assumption right?
- If the singleton lifetime wasn't used and EventQueue was static (or at least the LinkedList property was static), would things be different?
- Do you have any suggestions for improving this structure?
3 Answers
For such a task I would suggest taking a look at the ConcurrentQueue<T> collection.
It provides a thread-safe collection without the need for any locks.
Alternatively, you could use a SemaphoreSlim object as a lock if you cannot use the aforementioned queue.
As far as I know, using ConcurrentQueue should remove the access contention you mentioned. If you go with SemaphoreSlim, I would also suggest a small Task.Delay() in the background service loop, roughly as sketched below.
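A minimal sketch of that idea, keeping the IEventQueue shape from the question (the ConcurrentEventQueue name and the 100 ms delay are just placeholder choices):

using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

public class ConcurrentEventQueue : IEventQueue
{
    private readonly ConcurrentQueue<Task> _queue = new ConcurrentQueue<Task>();

    public void AddEvent(Task task) => _queue.Enqueue(task);

    // TryDequeue is thread-safe; returns null when the queue is empty.
    public Task GetNextEvent() => _queue.TryDequeue(out var task) ? task : null;
}

public class QueueWorker : BackgroundService
{
    private readonly IEventQueue _queue;

    public QueueWorker(IEventQueue queue) => _queue = queue;

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            var task = _queue.GetNextEvent();
            if (task != null)
                task.RunSynchronously();
            else
                await Task.Delay(100, stoppingToken); // back off instead of spinning
        }
    }
}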
As for improvements, you could remodel this service to make use of events, so that when something is added to the queue, the background service starts processing until nothing is left; see the sketch below.
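One way to sketch that event-driven variant, using SemaphoreSlim as a signal rather than a lock; WaitForEventAsync is a hypothetical extra member that would have to be added to IEventQueue (or exposed on the concrete type):

using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

public class SignaledEventQueue : IEventQueue
{
    private readonly ConcurrentQueue<Task> _queue = new ConcurrentQueue<Task>();
    private readonly SemaphoreSlim _signal = new SemaphoreSlim(0);

    public void AddEvent(Task task)
    {
        _queue.Enqueue(task);
        _signal.Release(); // wake up the worker
    }

    public Task GetNextEvent() => _queue.TryDequeue(out var task) ? task : null;

    // Hypothetical addition: completes once at least one item has been enqueued.
    public Task WaitForEventAsync(CancellationToken cancellationToken) =>
        _signal.WaitAsync(cancellationToken);
}

The worker loop would then await WaitForEventAsync(stoppingToken) before calling GetNextEvent(), so it sleeps while the queue is empty instead of spinning.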
I would advise against handling the tasks on a separate queue thread with a background worker.
Instead, consider using async tasks; .NET is really good at pushing work through, since tasks run on the thread pool anyway, and this structure frees up your response thread well enough.
Now say you wanted a worker-type object: does it have to be a singleton? I guess not, but regardless it can be injected. However, singletons have to handle concurrent requests, so only consider using them when they are really required.
What you get with the async/await pattern is that the thread is freed up to serve other requests as soon as the task is created from the factory and a wait handle is set up, which is really flippin' fast, and the thread is only needed again once the task has completed. A rough sketch of the idea follows.
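A rough sketch of that idea, assuming a hypothetical controller action and a hypothetical BuildReport method standing in for the heavy work; the request thread is released at the await and only comes back from the thread pool once the work is done:

using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/[controller]")]
public class ReportsController : ControllerBase
{
    [HttpPost]
    public async Task<IActionResult> Create(ReportRequest request)
    {
        // Offload the heavy part to the thread pool; the request thread is
        // freed at this await and only needed again when the task completes.
        var result = await Task.Run(() => BuildReport(request));
        return Ok(result);
    }

    // Placeholder for the actual CPU-bound work.
    private static ReportResult BuildReport(ReportRequest request) => new ReportResult();
}

public class ReportRequest { }
public class ReportResult { }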
If you want a thread-safe queue, consider using ConcurrentQueue<T>.
The problem with the background worker, as I see it, is that it is not a thread pool, while a thread pool is already at your disposal through Task.Factory.
After all, if we really want a queue rather than handling our work up front, wouldn't we want it to be persistent, so that once the controller has accepted the request it cannot be lost on a server crash? Then you need to consider a technology other than one that lives and dies with the web server.
Now you could have your web server publish to a topic on a persistent service bus that your background worker subscribes to, with messages only marked complete when processing actually succeeds … I think what you have in mind is something like this, but I fear that trying to build it into the web server's RAM will ultimately give you headaches.
There is the Channel<T> class, which is designed specifically for producer/consumer scenarios like this. For your use case, say you have a message class to send to your queue.
You can register Channel<Message>, ChannelReader<Message> and ChannelWriter<Message> as singletons. So now you can inject ChannelWriter<Message> where you want to send a message, and ChannelReader<Message> into your background service (via the constructor, for example). Your ExecuteAsync can use ChannelReader<T>.ReadAsync to wait for a message and process it accordingly.
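A sketch of how that wiring might look; the Message type, its Payload field, and the ChannelWorker name are placeholders, and the unbounded channel is just one possible choice:

using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

// Placeholder message type; the real shape depends on your use case.
public record Message(string Payload);

public class ChannelWorker : BackgroundService
{
    private readonly ChannelReader<Message> _reader;

    public ChannelWorker(ChannelReader<Message> reader) => _reader = reader;

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            // ReadAsync waits asynchronously until a message is available.
            var message = await _reader.ReadAsync(stoppingToken);
            // Process the message here.
        }
    }
}

The registrations would go where the question's existing AddSingleton calls live:

services.AddSingleton(Channel.CreateUnbounded<Message>());
services.AddSingleton(sp => sp.GetRequiredService<Channel<Message>>().Reader);
services.AddSingleton(sp => sp.GetRequiredService<Channel<Message>>().Writer);
services.AddHostedService<ChannelWorker>();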