What is the correct way to implement a repository in EF Core?
public IAsyncEnumerable<Order> GetOrder(int orderId)
{
return blablabla.AsAsyncEnumerable();
}
or
public async Task<IEnumerable<Order>> GetOrder(int orderId)
{
return await blablabla.ToListAsync();
}
Is it wise, performance-wise, to call AsAsyncEnumerable()? Is this approach safe? On the one hand it doesn't create a List<T> object, so it should be slightly faster. But on the other hand the query is not materialized, so we defer the SQL execution and the result can change in the meantime.
2 Answers
According to the source, .ToListAsync uses IAsyncEnumerable internally anyway, so there's not much performance benefit in one over the other. But one important feature of .ToListAsync and .ToArrayAsync is cancellation. A List will hold everything in memory, which becomes a serious performance concern only if the list is really big; in that case you might consider paging your big response.
The decision really comes down to whether you wish to buffer or stream.
If you want to buffer the results, use ToList() or ToListAsync(). If you want to stream the results, use AsEnumerable() or AsAsyncEnumerable(). From the docs:
In general, it’s best to stream, unless you need to buffer.
When you stream, once the data is read, you can’t read it again without hitting the DB again. So if you need to read the same data more than once, you’ll need to buffer.
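The contrast described above can be sketched as follows (OrderContext and Order are assumed names, not from the original post):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public static class BufferVsStreamDemo
{
    // OrderContext (a DbContext with DbSet<Order> Orders) is assumed
    // to be defined elsewhere in the project.
    public static async Task RunAsync(OrderContext context)
    {
        // Streaming: rows are pulled from the open data reader one at a
        // time; enumerating a second time would re-execute the SQL query.
        await foreach (var order in context.Orders.AsAsyncEnumerable())
        {
            Console.WriteLine(order.Id);
        }

        // Buffering: the entire result set is materialized into a
        // List<Order> once; re-reading the list does not hit the database.
        List<Order> buffered = await context.Orders.ToListAsync();
        foreach (var order in buffered)
        {
            Console.WriteLine(order.Id);
        }
    }
}
```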
If a repository streams an IEnumerable, the caller can choose to buffer it by calling ToList() (or ToListAsync() on an IAsyncEnumerable). We lose this flexibility if the repository returns an IList. So to answer your question, you're better off having the repository stream the result and letting the caller decide whether to buffer.
If the team working on the project is not comfortable with stream semantics, or if most of the code already buffers, it might make sense to suffix the methods that stream with something like AsStream (e.g. GetOrdersAsStream()) so that callers know they shouldn't enumerate the result more than once. So a repository could have: