Dec 11

Threading in C# Part -5


This is a continuation of the previous four posts:
Threading in C# Part -1
Threading in C# Part -2
Threading in C# Part -3
Threading in C# Part -4

Thread Pooling

Whenever you start a thread, a few hundred microseconds are spent organizing such things as a fresh private local variable stack. Each thread also consumes (by default) around 1 MB of memory. The thread pool cuts these overheads by sharing and recycling threads, allowing multithreading to be applied at a very granular level without a performance penalty. This is useful when leveraging multicore processors to execute computationally intensive code in parallel in “divide-and-conquer” style.

The thread pool also keeps a lid on the total number of worker threads it will run simultaneously. Too many active threads throttle the operating system with administrative burden and render CPU caches ineffective. Once a limit is reached, jobs queue up and start only when another finishes. This makes arbitrarily concurrent applications possible, such as a web server. (The asynchronous method pattern is an advanced technique that takes this further by making highly efficient use of the pooled threads).

There are a number of ways to enter the thread pool:

  • Via the Task Parallel Library (from Framework 4.0)
  • By calling ThreadPool.QueueUserWorkItem
  • Via asynchronous delegates
  • Via BackgroundWorker

The following constructs use the thread pool indirectly:

  • WCF, Remoting, ASP.NET, and ASMX Web Services application servers
  • System.Timers.Timer and System.Threading.Timer
  • Framework methods that end in Async, such as those on WebClient (the event-based asynchronous pattern), and most BeginXXX methods (the asynchronous programming model pattern)

The Task Parallel Library (TPL) and PLINQ are sufficiently powerful and high-level that you’ll want to use them to assist in multithreading even when thread pooling is unimportant. You can use the Task class as a simple means of running a delegate on a pooled thread.

There are a few things to be wary of when using pooled threads:

  • You cannot set the Name of a pooled thread, making debugging more difficult (although you can attach a description when debugging in Visual Studio’s Threads window).
  • Pooled threads are always background threads (this is usually not a problem).
  • Blocking a pooled thread may trigger additional latency in the early life of an application unless you call ThreadPool.SetMinThreads.
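To illustrate that last point, the minimum can be raised up front so the pool creates threads on demand without its usual ramp-up delay. This is only a sketch; the numbers here are illustrative, not a recommendation:

```csharp
using System.Threading;

class PoolTuning
{
    static void Main()
    {
        // First argument: minimum worker threads;
        // second: minimum I/O completion-port threads.
        // Returns false if the request could not be honored.
        bool ok = ThreadPool.SetMinThreads(50, 50);
    }
}
```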

You are free to change the priority of a pooled thread — it will be restored to normal when released back to the pool.

You can query if you’re currently executing on a pooled thread via the property Thread.CurrentThread.IsThreadPoolThread.
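A minimal check, assuming a console application:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // The main thread is not a pool thread...
        Console.WriteLine(Thread.CurrentThread.IsThreadPoolThread);   // False

        // ...whereas a Task's delegate runs on one.
        Task.Factory.StartNew(() =>
            Console.WriteLine(Thread.CurrentThread.IsThreadPoolThread)   // True
        ).Wait();
    }
}
```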

Entering the Thread Pool via TPL

You can enter the thread pool easily using the Task classes in the Task Parallel Library. The Task classes were introduced in Framework 4.0: if you’re familiar with the older constructs, consider the nongeneric Task class a replacement for ThreadPool.QueueUserWorkItem, and the generic Task<TResult> a replacement for asynchronous delegates. The newer constructs are faster, more convenient, and more flexible than the old.

To use the nongeneric Task class, call Task.Factory.StartNew, passing in a delegate of the target method:
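A minimal sketch (the Go method name is illustrative):

```csharp
using System;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        Task task = Task.Factory.StartNew(Go);   // Go runs on a pooled thread
        task.Wait();                             // block until it completes
    }

    static void Go()
    {
        Console.WriteLine("Hello from the thread pool!");
    }
}
```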

Task.Factory.StartNew returns a Task object, which you can then use to monitor the task — for instance, you can wait for it to complete by calling its Wait method.
The generic Task<TResult> class is a subclass of the non-generic Task. It lets you get a return value back from the task after it finishes executing. In the following example, we download a web page using Task<TResult>:
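A sketch along these lines (the URL is illustrative; WebClient lives in System.Net):

```csharp
using System;
using System.Net;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // The download runs on a pooled thread while Main continues.
        Task<string> task = Task.Factory.StartNew<string>(() =>
            DownloadString("http://www.linqpad.net"));

        // ... we can do other work here while the download proceeds ...

        // Result blocks until the task finishes, then returns its value.
        string result = task.Result;
        Console.WriteLine(result.Length);
    }

    static string DownloadString(string uri)
    {
        using (var wc = new WebClient())
            return wc.DownloadString(uri);
    }
}
```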

(The <string> type argument is for clarity: it would be inferred if we omitted it.)

Any unhandled exceptions are automatically re-thrown when you query the task’s Result property, wrapped in an AggregateException. However, if you fail to query its Result property (and don’t call Wait), any unhandled exception will take the process down.
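A short sketch of that re-throwing behavior (the exception message is illustrative):

```csharp
using System;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        Task<int> task = Task.Factory.StartNew<int>(() =>
        {
            throw new InvalidOperationException("oops");
        });

        try
        {
            int result = task.Result;   // worker exception re-thrown here
        }
        catch (AggregateException ex)
        {
            // The original exception is wrapped inside InnerException.
            Console.WriteLine(ex.InnerException.Message);   // "oops"
        }
    }
}
```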

The Task Parallel Library has many more features, and is particularly well suited to leveraging multicore processors.

Entering the Thread Pool Without TPL

You can’t use the Task Parallel Library if you’re targeting an earlier version of the .NET Framework (prior to 4.0). Instead, you must use one of the older constructs for entering the thread pool: ThreadPool.QueueUserWorkItem and asynchronous delegates. The difference between the two is that asynchronous delegates let you return data from the thread. Asynchronous delegates also marshal any exception back to the caller.

To use QueueUserWorkItem, simply call this method with a delegate that you want to run on a pooled thread:
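A minimal sketch (Go is the name of our hypothetical target method):

```csharp
using System;
using System.Threading;

class Program
{
    static void Main()
    {
        ThreadPool.QueueUserWorkItem(Go);        // data will be null
        ThreadPool.QueueUserWorkItem(Go, 123);   // data will be 123
        Console.ReadLine();                      // keep the process alive
    }

    static void Go(object data)   // matches the WaitCallback delegate
    {
        Console.WriteLine("Hello from the thread pool! " + data);
    }
}
```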

Our target method, Go, must accept a single object argument (to satisfy the WaitCallback delegate). This provides a convenient way of passing data to the method, just like with ParameterizedThreadStart. Unlike with Task, QueueUserWorkItem doesn’t return an object to help you subsequently manage execution. Also, you must explicitly deal with exceptions in the target code — unhandled exceptions will take down the program.

Asynchronous delegates

ThreadPool.QueueUserWorkItem doesn’t provide an easy mechanism for getting return values back from a thread after it has finished executing. Asynchronous delegate invocations (asynchronous delegates for short) solve this, allowing any number of typed arguments to be passed in both directions. Furthermore, unhandled exceptions on asynchronous delegates are conveniently re-thrown on the original thread (or more accurately, the thread that calls EndInvoke), and so they don’t need explicit handling.

Here’s how you start a worker task via an asynchronous delegate:

  1. Instantiate a delegate targeting the method you want to run in parallel (typically one of the predefined Func delegates).
  2. Call BeginInvoke on the delegate, saving its IAsyncResult return value.
    BeginInvoke returns immediately to the caller. You can then perform other activities while the pooled thread is working.
  3. When you need the results, call EndInvoke on the delegate, passing in the saved IAsyncResult object.

In the following example, we use an asynchronous delegate invocation to execute, concurrently with the main thread, a simple method that returns a string’s length:
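The three steps above can be sketched as follows (the Work method name is illustrative):

```csharp
using System;

class Program
{
    static void Main()
    {
        // Step 1: instantiate a delegate targeting the worker method.
        Func<string, int> method = Work;

        // Step 2: begin the call, saving the IAsyncResult "cookie".
        IAsyncResult cookie = method.BeginInvoke("test", null, null);

        // ... here's where we can do other work in parallel ...

        // Step 3: collect the result (blocks if the worker hasn't finished).
        int result = method.EndInvoke(cookie);
        Console.WriteLine("String length is: " + result);
    }

    static int Work(string s)
    {
        return s.Length;
    }
}
```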

EndInvoke does three things. First, it waits for the asynchronous delegate to finish executing, if it hasn’t already. Second, it receives the return value (as well as any ref or out parameters). Third, it throws any unhandled worker exception back to the calling thread.
