[SOLVED] Measuring CPU time for an HTTP request in an async controller


This content is from Stack Overflow. Question asked by skyde.

I was reading the Google SRE book chapter on handling overload: https://sre.google/sre-book/handling-overload/

It mentions:

An interesting part of the puzzle is computing in real time the amount of resources—specifically CPU—consumed by each individual request. This computation is particularly tricky for servers that don’t implement a thread-per-request model, where a pool of threads just executes different parts of all requests as they come in, using nonblocking APIs.

I know that in a thread-per-request model we could simply call `getrusage(RUSAGE_THREAD, &r)`, but in an ASP.NET controller with async methods there is no guarantee that the code before and after the `await` keyword executes on the same thread. And even if it does, that thread may also have executed code for other HTTP requests.

So, is there a way to measure how much CPU time an async function used?


Managed memory allocations will cause GC work that is done on other threads. Caches are also managed on other threads.

Measuring anything tied directly to a single request is therefore misleading.

If you have performance issues, you should collect metrics, ETW traces, and memory dumps, and analyze them.

This question was asked on Stack Overflow by skyde and answered by Paulo Morgado. It is licensed under the terms of CC BY-SA 2.5, CC BY-SA 3.0, and CC BY-SA 4.0.
