I was reading the Google SRE book chapter on handling overload: https://sre.google/sre-book/handling-overload/
They mention:
An interesting part of the puzzle is computing in real time the amount of resources—specifically CPU—consumed by each individual request. This computation is particularly tricky for servers that don’t implement a thread-per-request model, where a pool of threads just executes different parts of all requests as they come in, using nonblocking APIs.
I know in a thread-per-request model, we could simply call getrusage(RUSAGE_THREAD, &r);
but in an ASP.NET controller with async methods, it is not guaranteed that the code before and after an "await" will execute on the same thread,
and even if it does, that thread may also have executed code for other HTTP requests in the meantime.
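You can see the thread hop directly (a small console sketch; in a console app there is no SynchronizationContext, so the continuation resumes on an arbitrary thread-pool thread):

```csharp
using System;
using System.Threading.Tasks;

class AwaitThreadDemo
{
    static async Task Main()
    {
        Console.WriteLine($"before await: thread {Environment.CurrentManagedThreadId}");

        await Task.Delay(100); // continuation may resume on a different pool thread

        Console.WriteLine($"after  await: thread {Environment.CurrentManagedThreadId}");
        // Any per-thread counter (getrusage, GetThreadTimes) read across this
        // await could attribute another request's work to this one, or vice versa.
    }
}
```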
So is there a way to measure how much CPU time an async function used?
2 Answers
Found a NuGet package that provides this feature:
https://github.com/devizer/Universe.CpuUsage
Use the class CpuUsageAsyncWatcher.
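Usage is roughly like this. This is a sketch from memory of the project README, so the exact member names (Stop, Totals, GetSummaryCpuUsage) should be verified against the repository, and DoWorkAsync is a placeholder for your real async chain:

```csharp
using System;
using System.Threading.Tasks;
using Universe.CpuUsage; // NuGet package: Universe.CpuUsage

public class MeasuredHandler
{
    public async Task HandleAsync()
    {
        // Tracks CPU usage across the whole async chain, even when
        // continuations hop between thread-pool threads.
        var watcher = new CpuUsageAsyncWatcher();

        await DoWorkAsync(); // the async work being measured

        watcher.Stop();
        // Member names below follow the README; check the repo for the exact API.
        Console.WriteLine($"CPU usage: {watcher.Totals.GetSummaryCpuUsage()}");
    }

    static Task DoWorkAsync() => Task.Delay(10);
}
```

Internally the library measures per-thread CPU time at each context switch of the async chain and sums the pieces, which is why it keeps working when the code before and after an await runs on different threads.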
Managed memory allocations trigger GC work that runs on other threads, and caches are maintained on other threads too.
Measuring only the work that runs directly on behalf of a request is therefore misleading.
If you have performance issues, collect metrics, ETW traces, and memory dumps, and analyze those instead.