
We have an ASP.NET Web Forms application. In the A.aspx page, we have the following code:

    protected void Page_Load(object sender, EventArgs e)
    {
        if (IsPostBack)
        {
            // Simulate a long-running request (30 seconds)
            Thread.Sleep(1000 * 30);
        }
    }

B.aspx is just a normal page with nothing special about it. It is a very small page with no code-behind. After submitting A.aspx, we know the worker thread in IIS will sleep for 30 seconds. During this time span, we try to open B.aspx from the same browser, but this page also hangs there. After A.aspx finishes, B.aspx also finishes very quickly.

It seems that IIS queues requests from the same IP and the same browser while the first request has not finished. If we use two different browsers on the same computer, this does not happen.

We have adjusted the number of worker processes in the application pool, and also the number of worker threads in each worker process. The issue is still there.

What can we adjust to avoid this issue? Thanks.

2 Answers


  1. You are confusing the concept of threading with that of parallel processing (multi-core CPUs).

    Your browser, and in fact the web server, can run on a single-core CPU. So can the Windows desktop.

    There are boatloads of issues here, but one is: if a user makes more than one request, which one should finish first? Answer: the first request. If such requests could come back in a different order, then calling one web method without waiting for it to finish would result in your code running out of order, and in most cases you need the routine you called first to finish first.

    Threading usually refers to multiple tasks appearing to run at the same time on a single CPU (they don't actually run simultaneously; the CPU switches between them very quickly).

    Parallelism is having multiple tasks actually running at the same time on multiple CPUs. So the multi-threading model is NOT the same thing as parallelism on multi-core CPUs.

    If you REALLY want to use another CPU core from the code-behind of that web page, then you can start and launch a separate processing thread, one that will NOT be able to touch or modify the current web page.

    Compounding all of the above: once you grasp how the browser post-back and round trip work, you rather quickly realize that you can't actually change this architecture and model.

    This means the code-behind can't be asynchronous with respect to the ASP.NET page round trip, since you have ONE round trip. The code-behind must complete before the web page is returned. So even if you call some asynchronous code, you have to await it, and since you are waiting, the page round trip will NOT complete, and you wind up waiting for the server.

    Now, the above does not mean the code-behind on that round trip (or a web method call) can't create and start a WHOLE new separate CPU thread. However, the existing routine MUST wait for it; if it does not, then the page life cycle completes, all code is terminated, and that web page is sent back to the client side.

    So, when the code-behind "exits" or is "done", the page is sent back to the client side, the code-behind is terminated, and even the page class is disposed of when you exit that page code.

    I can't stress this enough: when you exit that page, the code-behind and the page class are DESTROYED (unless you await, and if you do that, then the code is not asynchronous anymore, is it?).
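    As a minimal sketch of "start a separate thread that outlives the page" (the class and helper names here are illustrative, not from the original post):

    ```csharp
    using System;
    using System.Threading;
    using System.Threading.Tasks;

    public partial class A : System.Web.UI.Page
    {
        protected void Button1_Click(object sender, EventArgs e)
        {
            // Fire-and-forget: this work continues after the page life
            // cycle completes. It must NOT touch Page, Response, or
            // Session, since those are disposed once the page is sent
            // back to the client.
            Task.Run(() => DoLongRunningWork());

            // The handler returns immediately, so the round trip
            // completes and the browser is not blocked.
        }

        private static void DoLongRunningWork()
        {
            Thread.Sleep(30000);  // stand-in for real work
        }
    }
    ```

    Keep in mind that IIS can recycle the application pool while such a background thread is still running, so this pattern only suits work you can afford to lose.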

    Take this simple example:

    Markup:

            <asp:Button ID="Button1" runat="server" Text="test"
                OnClick="Button1_Click"
                />
    

    And code behind:

        protected void Button1_Click(object sender, EventArgs e)
        {
    
            Debug.Print("Will jump to About.aspx page");
            Response.Redirect("~/About.aspx");
    
            Debug.Print("done jumping to about page");
    
        }
    

    So in the above, the first Debug.Print runs, and THEN we redirect to another page. At that point the current page is "terminated" (Response.Redirect ends the request by throwing a ThreadAbortException, which is the "error" you see and is expected), and the second Debug.Print will NEVER run.

    *(Screenshot: the debugger output; the second Debug.Print never appears.)*

    So, the point of the above is?

    When the code-behind is "done" or finished for a given web page, that is the point in time when the page travels back to the client. Thus, either you wait for the code to complete in the given page, OR you let the page travel back down to the client, and when you do that, the code-behind for that page is terminated.

    The threading model is also much the same for AJAX calls (from one browser client they will queue up, wait, and run one after another). And again, without such a design, attempts to write code would quickly become VERY difficult.

    So, don't confuse a threaded model with a parallel processing model; they are VASTLY different concepts.

    Either you let the page round trip complete and have the page sent back to the user, or you wait for the code-behind to complete. If you let the page travel back to the client, then the server no longer has a page into which content can be written or modified; the page is now back on the client side (the server does NOT hold a copy of the web page in server memory; it is disposed of). The server is now ready to accept another page, one posted by any user, not just the current user. Thus, the instant you let the page travel back to the client is the SAME instant that all code terminates for that page.

    So, a whole-page post-back can't be, and is not, asynchronous. It HAS to block until the new page is sent back from the server.

    **If you want something that doesn't block the processing of the request, it must be asynchronous with respect to the request.**

    So, your page can make separate AJAX requests to the server to fetch more data or whatever. While this eliminates the full-page round trip, and the calls do occur asynchronously from the browser's point of view, such calls can still queue up on the server given the threaded model you are using, not a parallel processing model. However, if those web methods respect the asynchronous model, they should NOT queue up; both can complete at the same time, provided they are waiting for some resource, and provided that "whatever" they wait on is correctly written with async in mind, or is an I/O-bound wait and NOT a processing-limited task.

    So, what if the web method code-behind (server side) is written as non-blocking, and is "waiting" for some resource that runs asynchronously?

    **Then no such queue should build up.**

    Edit: example code demonstrating the above.

    Let’s take an example:

    We will call 2 web methods from the page. Each web method takes 4 seconds (of waiting, not processing!). In that case, no queue occurs: both requests may share the same processor, but both are merely waiting, so no processing queue builds up.

    So, the markup is this:

    <asp:Button ID="Button1" runat="server" Text="call 2 web methods"
        OnClientClick="TestFun();return false;" />
    
    <br />
    Web1: 
    <asp:TextBox ID="txtWeb1" runat="server" ClientIDMode="Static">
    </asp:TextBox>
    <br />
    Web2: 
    <asp:TextBox ID="txtWeb2" runat="server" ClientIDMode="Static">
    </asp:TextBox>
    
    <script>
    
        function TestFun() {
    
            var tb1 = $('#txtWeb1');
            var tb2 = $('#txtWeb2');
            tb1.val('working');
            tb2.val('working');
    
            // call web 1
            $.ajax({
                type: "POST",
                url: "AJPostTest.aspx/Web1",
                data: "{}",
                contentType: "application/json; charset=utf-8",
                dataType: "json",
                success: function (response) {
                    tb1.val(response.d)
                },
                error: function (xhr, status, error) {
                    // Handle the error response
                    console.log(error, status, xhr);
                }
            });
    
            $.ajax({
                type: "POST",
                url: "AJPostTest.aspx/Web2",
                data: "{}",
                contentType: "application/json; charset=utf-8",
                dataType: "json",
                success: function (response) {
                    tb2.val(response.d)
                },
                error: function (xhr, status, error) {
                    // Handle the error response
                    console.log(error, status, xhr);
                }
            });
        }
    
    </script>
    

    And the 2 web methods on the web page are this:

        [WebMethod(enableSession:true)]
        public static string Web1()
        {
            Debug.Print("start 1");
            HttpContext.Current.Session["web1"] = "running";
            System.Threading.Thread.Sleep(4000);
            HttpContext.Current.Session["web1"] = "done";
            Debug.Print("start 1 done");
            return "Web 1 done";
        }
        [WebMethod(enableSession:true)]
        public static string Web2()
        {
            Debug.Print("start 2");
            HttpContext.Current.Session["web2"] = "running";
            System.Threading.Thread.Sleep(4000);
            HttpContext.Current.Session["web2"] = "done";
            Debug.Print("start 2 done");
            return "Web 2 done";
        }
    

    So, when we run the above: if the calls WERE queued up, the above should take 8 seconds, but as you can see, it only takes 4 seconds.

    We see this:

    *(Screenshot: both web method calls complete in about 4 seconds.)*

    Note how the above takes only 4 seconds. This is because the web method code is not processing-bound (it is waiting on disk, or SQL Server, or in this case Sleep). Since the code is not processing-bound, we enjoy the benefit of the threaded model. But what happens if the example code were not waiting on some resource (the fake sleep) and was in fact processing-bound? In the processing-bound case, the 2 web calls WOULD take 8 seconds to run. In that case, we could bring the time back down to 4 seconds by introducing a new processor thread outside the current one. In other words, create a new thread in your code-behind, and that WILL use additional CPU cores if available.

  2. You are getting a delay in the response because a lock is placed on the session for the same user (via the session ID).
    When you use another browser, or create a new session in Chrome, and then access page B, it responds immediately.

    I took the same sample provided by Albert: the first request takes 4 s to respond and the second one takes 8 s.
    Interestingly, the very first time both requests complete in 4 s, but subsequent requests take 4 s and 8 s respectively.

    This happens no matter how many cores your PC has. You can check my code.

    Note: we can disable session state at the page level. I implemented the same web methods with session disabled, and both responded in 4 s.
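    As a sketch of that page-level fix (the `CodeBehind` and `Inherits` values are placeholders for your own page): the `@ Page` directive controls the session lock. `EnableSessionState="false"` skips the session entirely, and `"ReadOnly"` takes a non-exclusive read lock, so read-only requests are not serialized behind each other:

    ```aspx
    <%@ Page Language="C#" AutoEventWireup="true"
        CodeBehind="B.aspx.cs" Inherits="MyApp.B"
        EnableSessionState="false" %>

    <%-- Or, if the page only reads session values: --%>
    <%-- EnableSessionState="ReadOnly" --%>
    ```

    Likewise, for page web methods, plain `[WebMethod]` (the default, with session disabled) avoids the exclusive session lock in the same way; `[WebMethod(enableSession: true)]` is what opts in to the lock.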

    Hope this helps someone avoid using session state unnecessarily.

    *(Screenshot: the measured time delay of the two requests.)*
