
I have an eBay account. To make updating my products easier, I created a Windows service in C#. It waits (via a FileSystemWatcher) for an XML file, and once a file appears, the service reads it and sends requests to the eBay server through their API. A file might hold about 1000-3000 rows.

Previously I created 12 threads to make it faster. I don't know why I chose 12 threads; it just seemed like a reasonable number: not too many and not too few.

So here is what the method looked like before (a bit dirty; 12 should be a constant):

private static void UpdateAsync(IEnumerable<ProductInfo> products)
{
    ProductInfo[] productsInfo = products.ToArray();
    Thread[] threads = new Thread[12];
    bool sendPartially = productsInfo.Length > 12;
    int delta = 0;
    for (int i = 0; i < productsInfo.Length; i++)
    {
        if (sendPartially)
        {
            if (i != 0 && i % 12 == 0)
            {
                WaitForExecutedThreads(threads);
                delta = 12 * (i / 12);
            }
        }

        ProductInfo product = productsInfo[i];
        threads[i - delta] = new Thread(_ => product.UpdateItem(context));
        threads[i - delta].Start();
    }

    WaitForExecutedThreads(threads);
}

Then I was told that the threads are unnecessary because there is only one network interface: it is too narrow to allow 12 HTTPS requests to execute at the same time, so each thread would simply end up waiting on another.

That's why I decided not to use multi-threading at all and just send plain sequential requests like this:

    foreach (var p in productsInfo)
    {
        p.UpdateItem(context);
    }

I know it's bad to send all the requests separately, because each of them carries its own headers, etc. That's not efficient.

Well, what is the right approach to do that?

5 Answers

  1. I'm not sure what the **only one** network interface part means. It sounds like bad advice.

    Even browsers typically issue 5 to 10 simultaneous requests when pulling down a web page, in order to grab its various parts (CSS, JavaScript, images, the page itself...). So having 12 threads ought to be just fine.

    Regarding the headers etc., you can't avoid those for individual requests. The only other way is if the API allows you to send multiple requests in a single transaction. I'm not familiar with the eBay API, but I would imagine they support it.

    You might look at this question for a hint: CompleteSale Ebay API for multiple items

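If the batching route is taken -- some eBay Trading API calls, such as ReviseInventoryStatus, reportedly accept several items (up to four) per request, though that limit is worth verifying against the current docs -- the products first need to be grouped into fixed-size batches. A minimal sketch, where `SendBatch` is a hypothetical method that would wrap the actual multi-item API call:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class Batching
{
    // Splits a sequence into consecutive batches of at most `size` items.
    public static IEnumerable<List<T>> Chunk<T>(IEnumerable<T> source, int size)
    {
        var batch = new List<T>(size);
        foreach (var item in source)
        {
            batch.Add(item);
            if (batch.Count == size)
            {
                yield return batch;
                batch = new List<T>(size);
            }
        }
        if (batch.Count > 0)
            yield return batch; // flush the final, possibly short, batch
    }
}

// Usage (SendBatch is hypothetical -- it would build one API request
// containing every product in the batch):
//
// foreach (var batch in Batching.Chunk(productsInfo, 4))
//     SendBatch(batch);
```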
  2. When I do something like this I use the asynchronous HttpWebRequest methods, and they work well for me. Look into HttpWebRequest.BeginGetResponse and HttpWebRequest.BeginGetRequestStream.

    More info: http://msdn.microsoft.com/en-us/library/86wf6409(VS.71).aspx

    I think this is the right approach.

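A sketch of that Begin/End (APM) pattern, with the URL and response handling as placeholders; a CountdownEvent is one way for the caller to block until every completion callback has fired:

```csharp
using System;
using System.IO;
using System.Net;
using System.Threading;

static class ApmSketch
{
    // Fires one request and signals `done` from the completion callback.
    public static void BeginUpdate(string url, CountdownEvent done)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.BeginGetResponse(ar =>
        {
            try
            {
                using (var response = (HttpWebResponse)request.EndGetResponse(ar))
                using (var reader = new StreamReader(response.GetResponseStream()))
                {
                    reader.ReadToEnd(); // parse the eBay reply here
                }
            }
            finally
            {
                done.Signal(); // count this request as finished either way
            }
        }, null);
    }
}

// Usage: fire everything, then wait once for all callbacks.
//
// var done = new CountdownEvent(urls.Count);
// foreach (var url in urls) ApmSketch.BeginUpdate(url, done);
// done.Wait();
```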
  3. Since you want maximum efficiency, I advise you to use the .NET thread pool. The ThreadPool class schedules work onto a pool of threads sized for the machine. There are many examples of it on the web.

    One of them is here: msdn with example

    ThreadPool.QueueUserWorkItem(YourRequestFunctionName, parameterObject);

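A sketch of that thread-pool approach, generalized over the update action (since `UpdateItem` and `context` are members of the question's own code); a CountdownEvent tells the caller when every queued item has finished:

```csharp
using System;
using System.Threading;

static class PoolUpdate
{
    // Queues one pool work item per product and blocks until all have run.
    public static void UpdateAll<T>(T[] products, Action<T> updateItem)
    {
        using (var done = new CountdownEvent(products.Length))
        {
            foreach (var product in products)
            {
                var p = product; // capture a fresh variable per iteration
                ThreadPool.QueueUserWorkItem(_ =>
                {
                    try { updateItem(p); }
                    finally { done.Signal(); } // count it even if it throws
                });
            }
            done.Wait(); // returns once every queued item has signaled
        }
    }
}

// Usage, with the question's own method:
// PoolUpdate.UpdateAll(productsInfo, p => p.UpdateItem(context));
```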
  4. Since you don't need the CPU during this network operation, creating new threads isn't the optimal choice. (When the CPU is the actual bottleneck, you generally want only one thread per processor, so 2 or 4 threads would be ideal.)

    If you had an async version of the UpdateItem method, you could call it as many times as you have items to update, then handle the completion in a callback function. This would be much more efficient, since you would not be starting threads that don’t really do anything, but you would be executing multiple requests at nearly the same time.

    In the event that you don’t have an async version of the method, using threads will be the only easy way to get side-by-side update calls. You might use ThreadPool.QueueUserWorkItem instead. That takes advantage of threads which are already available, avoiding the cost of spinning up new threads every time.

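On current .NET, the "async version plus completion handling" this answer describes is usually written with async/await; a SemaphoreSlim caps how many updates are in flight at once. This is only a sketch: `UpdateItemAsync` is an assumed async wrapper, since the question's `UpdateItem` is synchronous.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

static class ThrottledUpdate
{
    // Starts one async update per item, but never allows more than
    // `maxConcurrency` of them to be in flight at the same time.
    public static async Task UpdateAllAsync<T>(
        IEnumerable<T> items, Func<T, Task> updateItemAsync, int maxConcurrency)
    {
        using (var gate = new SemaphoreSlim(maxConcurrency))
        {
            var tasks = items.Select(async item =>
            {
                await gate.WaitAsync();          // wait for a free slot
                try { await updateItemAsync(item); }
                finally { gate.Release(); }      // free the slot for the next item
            }).ToList();
            await Task.WhenAll(tasks);
        }
    }
}

// Usage (UpdateItemAsync is hypothetical):
// await ThrottledUpdate.UpdateAllAsync(productsInfo,
//     p => p.UpdateItemAsync(context), maxConcurrency: 12);
```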
  5. The cost of starting a thread is insignificant compared to the time it takes to execute a web service call. What's worked well for me is to queue a Thread instance for each request you want to send, using a worker method that invokes a callback once the web service call completes. Threads only consume resources once they're started, not when they're merely instantiated.

    I then use a supervisor class that dequeues a thread, starts it, and adds it to a Dictionary keyed by the thread's ManagedThreadId. This is repeated until you reach your maximum of twelve concurrent connections. As each thread finishes, it invokes the callback method on the supervisor, which removes the thread from the Dictionary and launches another queued thread. Once the Queue and Dictionary are both empty, you're done.

    If you wire a UI app to your service you can adjust the thread limit while monitoring your throughput and the queued and active threads so you can play with it in real-time.

    You should also set the maxconnection property in your app.config to something like 100 to give you room to experiment.

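The supervisor described above could be sketched like this; the concurrency limit, the work delegate, and the completion event are assumptions filled in for illustration (the original describes the Queue/Dictionary/callback structure but not the exact code):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Sketch of the supervisor: a queue of not-yet-started threads, a
// dictionary of running ones keyed by ManagedThreadId, and a callback
// that launches the next queued thread as each one finishes.
class Supervisor
{
    private readonly Queue<Thread> _pending = new Queue<Thread>();
    private readonly Dictionary<int, Thread> _running = new Dictionary<int, Thread>();
    private readonly object _lock = new object();
    private readonly ManualResetEvent _allDone = new ManualResetEvent(false);
    private readonly int _maxConcurrent;

    public Supervisor(int maxConcurrent) { _maxConcurrent = maxConcurrent; }

    // Call before Run(): wraps the work so the supervisor hears about completion.
    public void Enqueue(Action work)
    {
        Thread t = null;
        t = new Thread(() =>
        {
            try { work(); }
            finally { OnFinished(t); }
        });
        _pending.Enqueue(t);
    }

    // Starts up to the concurrency limit, then blocks until everything is done.
    public void Run()
    {
        lock (_lock)
        {
            if (_pending.Count == 0 && _running.Count == 0) return;
            while (_running.Count < _maxConcurrent && _pending.Count > 0)
                StartNextLocked();
        }
        _allDone.WaitOne();
    }

    private void StartNextLocked()
    {
        Thread next = _pending.Dequeue();
        next.Start();
        _running[next.ManagedThreadId] = next;
    }

    private void OnFinished(Thread t)
    {
        lock (_lock)
        {
            _running.Remove(t.ManagedThreadId);
            if (_pending.Count > 0) StartNextLocked();
            else if (_running.Count == 0) _allDone.Set();
        }
    }
}
```

With a concurrency limit passed into the constructor, wiring a UI slider to it for real-time tuning (as the answer suggests) only requires making `_maxConcurrent` mutable.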