Threading

.Threading
Thread Objects | Foreground/Background and Priority | Passing Data | Local Data | Shared Data | Thread Pools

"Threads are paths of execution within an application which are allocated time slices of the CPU. Shared memory and context switching are added complexities when using multiple threads in an application. .NET provides features to help deal with the added complexities of multi-threading. However, multi-threading should be used judiciously and coded carefully because of its capability to create intermittent and unreproducible bugs."

Threads resemble operating system processes but have some fundamental differences. Processes run in parallel on a computer while threads run in parallel within a process. However processes are isolated from each other while threads share memory with each other. Sharing memory can make threads useful, but it also introduces complexities in keeping the data synchronized. Switching between threads (context switching) also introduces added complexities. There are a number of features in .NET that help programmers deal with these complexities.

In this day of multi-core processors, threads can help to fully utilize the CPU processing power, if they are used correctly. Threading requires the CPU to handle creating, scheduling, switching, and destroying the threads. There are situations where the overhead involved in using threads can actually slow the process down, such as when processing is bound by a single resource (e.g. I/O operations) or in a workflow where each stage depends entirely upon results from the previous stage. Before using multiple threads, the benefits should be weighed against the added complexities and additional resource usage. Using multiple threads requires good design and careful coding. Unintended interaction between threads can cause intermittent and non-reproducible bugs. Staying with proven designs (when possible) and keeping thread interaction to a minimum can help reduce bugs. It can also be useful to put multithreading logic into reusable classes which can be tested independently.

According to Microsoft Practices and Patterns, some scenarios where threading should be considered include:

  1. When communicating over a network, for example with a Web server, database, or remote object.
  2. When time-consuming local operations can cause the UI to freeze.
  3. To improve the performance of application startup and initialization.
  4. To distinguish tasks of varying priority.

Included in these categories are network communications such as Web service calls, HTTP requests, remote object calls, RPC, DCOM, and distributed transactions. Local operations in these categories include: image rendering, data manipulation, data sorting, and searching. Also, time-critical tasks can be put on a thread with a higher priority, and application initialization can be put on a separate thread to increase responsiveness.




Thread Objects

A .NET program starts in a single "main" thread that is created by the runtime and operating system. It will remain a single-threaded application unless additional threads are introduced (either directly or indirectly). Some .NET technologies (such as WCF and ASP.NET) will introduce multi-threading indirectly; the direct way to create a thread is with the Thread class. The Thread class creates and controls a thread; it sets the thread's priority and reports its status. A thread is instantiated with a delegate and is started by calling the Thread.Start() method. The following example program creates a thread and then writes to the screen concurrently from both the new thread and the main thread. The Thread.Join() method causes the main thread to wait for the new thread to finish before continuing execution. A lambda expression is used for the delegate.

.Threads Concurrently Writing to Screen
Concurrently Writing to Screen from Two Threads

Note how the threads are preempted and how the Console class synchronizes the use of the output stream so it can be written to from multiple threads. Each thread is allocated a time slice to perform work. Calling Thread.Sleep(0) will cause the thread to relinquish its current time slice immediately. The newer Thread.Yield() performs a similar function, except it only relinquishes to threads running on the same processor. These calls can be useful when testing programs to uncover bugs. The Thread.Name property is also useful when debugging as the name is displayed inside Visual Studio's debugger.

using System.Threading;

namespace ThreadingExample1
{
    class Program
    {
        static public int mainCount = 0;
        static public int threadCount = 0;
        static public int totalCount = 0;

        static void Main()
        {
            Thread t = new Thread(() =>
            {
                for (int i = 0; i < 900; i++)
                {
                    System.Console.Write("t");
                    threadCount++;
                    totalCount++;
                }
            });

            t.Start();
            for (int i = 0; i < 500; i++)
            {
                System.Console.Write("M");
                mainCount++;
                totalCount++;
            }
            t.Join();
            System.Console.WriteLine("\n\nMain Count   : {0,4}", mainCount);
            System.Console.WriteLine("Thread Count : {0,4}", threadCount);
            System.Console.WriteLine("Total Count  : {0,4}\n", totalCount);
        }
    }
}





Foreground/Background and Priority

"Threads created explicitly are foreground threads. Foreground threads keep the application alive, while background threads do not. Background threads are best used for non-critical functions that are no longer needed once the main task has finished."

The Thread.IsBackground property is used to query and set the thread's background/foreground status. Foreground threads keep the application alive, while background threads do not. By default, once all the foreground threads have finished, any background threads which are still running are abruptly terminated. However, a foreground thread can be directed to wait for a background thread with the Thread.Join() method. If a background thread might never finish, or is taking too long to finish, the Thread.Abort() method can be used to stop it. This method is not recommended since it can leave corrupt state and possibly make the application unusable. You can specify a time limit with the Thread.Join() method. For example, Thread.Join(1000) would wait for one second and then continue; however, this does not automatically stop the thread. The preferred method for stopping a thread after a certain period of time is through the use of a shared variable with which one thread can signal another thread to stop.
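These ideas can be combined in a small sketch: a background thread loops until a shared flag is set, the main thread waits with a timed Join, and on time-out it signals the worker to stop cooperatively. (The names BackgroundStopExample, stopRequested, and the 1000 ms time-out are illustrative choices, not from the original text.)

```csharp
using System;
using System.Threading;

class BackgroundStopExample
{
    // Shared flag one thread uses to signal another to stop (volatile so
    // the worker always sees the latest value rather than a cached one).
    static volatile bool stopRequested = false;

    static void Main()
    {
        Thread worker = new Thread(() =>
        {
            while (!stopRequested)
                Thread.Sleep(50);               // simulate ongoing work
            Console.WriteLine("Worker stopped cooperatively.");
        });
        worker.IsBackground = true;             // will not keep the app alive
        worker.Start();

        // Wait up to one second; Join(1000) returns false on time-out
        // and does NOT stop the thread by itself.
        if (!worker.Join(1000))
        {
            stopRequested = true;               // preferred stop mechanism
            worker.Join();                      // now wait for a clean exit
        }
        Console.WriteLine("Main finished.");
    }
}
```

Because the worker is a background thread, forgetting the final Join would risk the process exiting before "Worker stopped cooperatively." is printed.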

A thread's priority helps to determine how much execution time it receives in relation to other active threads. It is ultimately up to the operating system's scheduler to determine how much time each thread is allocated, but thread priorities provide hints as to the desired allocation. That is, a thread assigned the highest priority is not guaranteed to get more time than lower-priority threads. For example, the scheduler may determine that finishing its current work is more important than allocating extra time to other threads. Be aware that a consequence of increasing the priority of some threads may be preventing lower-priority threads from executing at their required levels. In most situations a thread's priority is best left at its default value of Normal. The thread priorities are enumerated as:

  1. Lowest
  2. BelowNormal
  3. Normal
  4. AboveNormal
  5. Highest

The following program creates three threads, each with a different name and priority. One thread is created as a foreground thread and the other two threads are created as background threads. The threads run in a loop until the enter key is pressed, after which a shared variable is set which signals the threads to stop. Once the threads are stopped, the increment counters are printed as shown by the screen capture. Also shown is the Threads window from Visual Studio while debugging the application.

.Thread Priorities
Threads With Different Priority Settings

using System;
using System.Threading;

namespace ThreadingExample2
{
    class Program
    {
        volatile static bool stopped = false;

        static void Main()
        {
            double[] count = new double[4];
            ThreadPriority[] priority = new ThreadPriority[4]
            {
               ThreadPriority.BelowNormal,
               ThreadPriority.Lowest,
               ThreadPriority.Normal,
               ThreadPriority.Highest
            };

            // Assign the main thread a name
            Thread.CurrentThread.Name = "KevinMain";

            // Threads
            Thread t1 = new Thread(() => { while (!stopped) count[1]++; });
            Thread t2 = new Thread(() => { while (!stopped) count[2]++; });
            Thread t3 = new Thread(() => { while (!stopped) count[3]++; });

            t1.Name = "Kevin Thread 1";
            t1.Priority = priority[1];
            t1.IsBackground = false;
            System.Console.WriteLine("T1 is: {0}", (t1.IsBackground ? "Background" : "Foreground"));

            t2.Name = "Kevin Thread 2";
            t2.Priority = priority[2];
            t2.IsBackground = true;
            System.Console.WriteLine("T2 is: {0}", (t2.IsBackground ? "Background" : "Foreground"));

            t3.Name = "Kevin Thread 3";
            t3.Priority = priority[3];
            t3.IsBackground = true;
            System.Console.WriteLine("T3 is: {0}", (t3.IsBackground ? "Background" : "Foreground"));

            t1.Start(); t2.Start(); t3.Start();
            Console.ReadKey();
            stopped = true;
            t1.Join(); t2.Join(); t3.Join();

            Console.WriteLine("\nT1 Priority: {0}  Count: {1,12}", priority[1], count[1]);
            Console.WriteLine("T2 Priority: {0}  Count: {1,12}", priority[2], count[2]);
            Console.WriteLine("T3 Priority: {0}  Count: {1,12}\n", priority[3], count[3]);
        }
    }
}





Passing Data

Data can be passed to threads via the Thread.Start() method or by using lambda expressions. The Start() method can only pass in one object and cannot return a value. Lambda expressions can call the method with the desired parameters and return a value (via closure). However, a lambda expression's closure, a feature for capturing variables, can also produce unexpected variable values.

Passing a Parameter with Thread.Start()

The Thread.Start() method can accept only one parameter of type object. If passing a single primitive value, it will need to be cast before it is used. For passing more complex values a class needs to be created to pass into the thread. The following example shows how a class containing two data values can be passed into threads.

/***************************************************
* Passing a parameter to a thread using an object *
***************************************************/
using System;
using System.Threading;

namespace ThreadExample4
{
    class Program
    {
        public class Pet
        {
            public string Type { get; set; }
            public string Name { get; set; }
            public override string ToString()
            {
                return Name + " the " + Type;
            }
        }

        static void Main(string[] args)
        {
            Thread t1 = new Thread(myMethod);
            Thread t2 = new Thread(myMethod);
            t1.Start(new Pet() { Type = "Cat", Name = "Shadow" });
            t2.Start(new Pet() { Type = "Dog", Name = "Holly" });
            t1.Join();
            t2.Join();
            Console.WriteLine();
        }

        private static void myMethod(object theObject)
        {
            Console.WriteLine(theObject);
        }
    }
}

Passing Parameter with Lambda Expressions

The most powerful way to pass data to a thread is with a lambda expression. Using lambda expressions it is possible to call the method with the desired arguments. The closure property of lambda expressions also makes it possible to return a value from a thread.

/**********************************************************
* Pass using lambda, use lambda closure for return value *
**********************************************************/
using System;
using System.Threading;

namespace ThreadExample5
{
    class Program
    {
        static void Main(string[] args)
        {
            int t1Return = 0;
            int t2Return = 0;
            Thread t1 = new Thread(() => { System.Console.WriteLine("Shadow the Cat"); t1Return = 1; });
            Thread t2 = new Thread(() => { System.Console.WriteLine("Holly the Dog"); t2Return = 2; });           

            t1.Start();
            t2.Start();
            t1.Join();           
            t2.Join();
            Console.WriteLine(t1Return); // Prints: 1
            Console.WriteLine(t2Return); // Prints: 2
            Console.WriteLine();
        }
    }
}

Lambda Expression Closure Affecting Variable Values

When using lambda expressions, be aware of the "closure" that happens with anonymous functions (lambda expressions and anonymous methods). When an anonymous function uses a variable that is not passed in, it captures that variable from the scope in which it was declared: the variable itself, not a snapshot of its value. The example below shows how closure works with lambda expressions. In this case the x variable is stored in only one memory location and is shared by both threads.

using System;
using System.Threading;

namespace ThreadExample3
{
    class Program
    {       
        static int x = 0;

        static void Main()
        {
            Thread t1 = new Thread(() => { x++; System.Console.WriteLine(x); });           
            Thread t2 = new Thread(() => { x++; System.Console.WriteLine(x); });

            t1.Start(); // Typically prints: 6 (exact output depends on thread timing)
            x += 5;
            t2.Start(); // Typically prints: 7
        }
    }
}





Local Data

Local data storage is used to keep data isolated within a thread. This ensures that each thread has a separate copy of its own data. Thread-local storage can be useful in optimizing parallel code, or for supporting the execution path's infrastructure. The easiest approach to thread-local data storage is to mark a static field with the ThreadStatic attribute, but this only works for static data. Another approach, which works for both static and instance data, is to use the ThreadLocal<T> generic class.

Using ThreadStatic Attribute for Creating Thread-Local Storage

When the ThreadStatic attribute is used, each thread is provided with its own local copy of the variable, as opposed to sharing one copy. The next example program shows how using the ThreadStatic attribute changes the behavior of the previous example.

using System;
using System.Threading;

namespace ThreadExample3
{
    class Program
    {
        [ThreadStatic]
        static int x = 0;

        static void Main()
        {
            Thread t1 = new Thread(() => { x++; System.Console.WriteLine(x); });           
            Thread t2 = new Thread(() => { x++; System.Console.WriteLine(x); });

            t1.Start(); // Prints: 1
            x += 5;
            t2.Start(); // Prints: 1
        }
    }
}

Using ThreadLocal<T> for Thread-Local Storage

ThreadLocal<T> can be used for both static and instance variables and allows you to specify default values. It was introduced in .NET 4. Internally the ThreadLocal instance automatically sets up the static data, performs implicit casting as necessary, and manages the data's lifetime. The program below is equivalent to the preceding example except for its use of ThreadLocal<T>.

using System;
using System.Threading;

namespace ThreadExample3
{
    class Program
    {
        static void Main()
        {
            // Set ThreadLocal using Func<int>
            ThreadLocal<int> x = new ThreadLocal<int>(() => 0);

            Thread t1 = new Thread(() => { x.Value++; System.Console.WriteLine(x.Value); });
            Thread t2 = new Thread(() => { x.Value++; System.Console.WriteLine(x.Value); });

            t1.Start(); // Prints: 1
            x.Value += 5;
            t2.Start(); // Prints: 1
        }
    }
}





Shared Data

Shared fields allow the passing of data to and from threads. Shared fields also allow threads to communicate with each other. However shared fields are also the primary cause of complexity and obscure errors when multithreading. It is advisable to keep the amount of shared data to a minimum, but when it is required there are several ways to manage exclusive access to the data. One way is with synchronization constructs, which can be divided into four categories:

  1. Blocking - these wait for another thread to finish or time-out.
    • Thread.Join() - blocks the calling thread until a thread terminates
    • Thread.Sleep() - suspends the current thread for the specified amount of time.
    • Task.Wait() - waits for the Task to complete execution (optionally within a specified time interval).
  2. Locking - these limit the number of threads that can execute a piece of code.
    • Lock - statement used to obtain a mutually exclusive lock for an object.
    • Monitor - class which provides a mechanism that synchronizes access to objects.
    • Mutex - synchronization primitive that can also be used for interprocess synchronization (app and machine wide locks).
    • SpinLock - thread waits in a loop (spins) repeatedly checking until lock becomes available.
    • Semaphore - class that limits the number of threads that can concurrently access a resource.
    • Reader/Writer locks - classes (e.g. ReaderWriterLockSlim) that support a single writer and multiple readers.
  3. Signaling - these allow a thread to pause until it receives notification.
    • Barrier - a class that enables multiple tasks to cooperatively work on an algorithm in parallel through multiple phases.
    • Monitor's Wait/Pulse methods - use a waiting queue so one thread can signal another while coordinating around a lock.
    • Event Wait Handles - one thread waits until it receives notification from another.
      • CountdownEvent - a class that represents a synchronization primitive that is signaled when its count reaches zero.
      • ManualResetEvent - functions like a gate. WaitOne waits at the gate. Set opens the gate.
      • AutoResetEvent - like a ticket turnstile. Only one can go through at a time.
  4. Nonblocking - these protect a common field from being reordered or cached by the compiler's optimizer.
    • volatile - exempts a field from compiler optimizations that assume single-threaded access by creating memory barriers (a.k.a. memory fences). Instructs the compiler to generate an acquire-fence on every read and a release-fence on every write to that field.
    • Interlocked - a class which provides atomic operations for variables that are shared by multiple threads. (For example, reads and writes of 64-bit fields are not atomic on 32-bit operating systems.) Interlocked includes methods such as Increment(), Decrement(), Exchange(), and CompareExchange().
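The difference between unsynchronized, locked, and atomic access can be shown with a minimal sketch (the class and field names below are illustrative): four threads each increment three shared counters 100,000 times, once with no synchronization, once under the lock statement, and once with Interlocked.Increment.

```csharp
using System;
using System.Threading;

class SharedCounterExample
{
    static int unsafeCount = 0;
    static int lockedCount = 0;
    static int atomicCount = 0;
    static readonly object gate = new object();   // private lock object

    static void Main()
    {
        Thread[] threads = new Thread[4];
        for (int t = 0; t < threads.Length; t++)
        {
            threads[t] = new Thread(() =>
            {
                for (int i = 0; i < 100000; i++)
                {
                    unsafeCount++;                          // race: updates can be lost
                    lock (gate) { lockedCount++; }          // mutual exclusion
                    Interlocked.Increment(ref atomicCount); // lock-free atomic update
                }
            });
            threads[t].Start();
        }
        foreach (Thread t in threads) t.Join();

        Console.WriteLine("Unsafe : {0}", unsafeCount);  // often less than 400000
        Console.WriteLine("Locked : {0}", lockedCount);  // always 400000
        Console.WriteLine("Atomic : {0}", atomicCount);  // always 400000
    }
}
```

The unsynchronized counter frequently comes up short because `x++` is a read-modify-write sequence that two threads can interleave; the locked and Interlocked counters always reach 400000.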

Besides supporting these synchronization constructs, .NET also provides a number of constructs which make use of multicore processors or multiple processors. They were introduced in .NET 4 and are collectively known as the Parallel Framework (PFX). Included in this framework are the Parallel class and the Task Parallel Library (TPL). PFX contains constructs for achieving both data parallelism and task parallelism. Of the two, data parallelism is easier to code, scales better, and is also conducive to structured parallelism (parallel units start/finish at the same point). Additionally, PFX includes Parallel Language Integrated Query (PLINQ), which is highly automated, fast, and easy to implement.
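A minimal taste of both styles (the class name and sample data are illustrative): Parallel.For runs a loop body across pool threads and returns only when every iteration has finished, while AsParallel() turns an ordinary LINQ query into a PLINQ query partitioned across cores.

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

class ParallelSketch
{
    static void Main()
    {
        // Data parallelism with Parallel.For: iterations run on pool
        // threads; structured, so all finish before the call returns.
        long[] squares = new long[10];
        Parallel.For(0, squares.Length, i => squares[i] = (long)i * i);
        Console.WriteLine(string.Join(",", squares));

        // PLINQ: AsParallel() partitions the source across the cores.
        int evenCount = Enumerable.Range(0, 1000)
                                  .AsParallel()
                                  .Count(n => n % 2 == 0);
        Console.WriteLine(evenCount); // 500
    }
}
```

Each slot of the array is written by exactly one iteration, so no locking is needed here; that independence between iterations is what makes data parallelism easy to scale.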

.NET provides further support with a number of thread-safe collection classes found in the System.Collections.Concurrent namespace. These collections include: BlockingCollection<T>, ConcurrentDictionary<TKey,TValue>, ConcurrentQueue<T>, ConcurrentStack<T>, and Partitioner. Below is a Producer/Consumer example program that uses BlockingCollection<T>.

Producer/Consumer example program that uses BlockingCollection<T>

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

namespace test
{
    class Program
    {
        static void Main(string[] args)
        {
            BlockingCollection<int> bCollection = new BlockingCollection<int>();

            // Consumer 1
            var consumer1 = Task.Factory.StartNew(() =>
            {
                foreach (int data in bCollection.GetConsumingEnumerable())
                    Console.Write("c1=" + data + ", ");
            });

            // Consumer 2
            var consumer2 = Task.Factory.StartNew(() =>
            {
                foreach (int data in bCollection.GetConsumingEnumerable())
                    Console.Write("c2=" + data + ", ");
            });

            // Producer of 75 integers
            var producer = Task.Factory.StartNew(() =>
            {
                for (int i = 0; i < 75; i++)
                    bCollection.Add(i);
                bCollection.CompleteAdding();
            });

            // wait for producer to finish producing
            producer.Wait();

            // wait for all consumers to finish consuming
            Task.WaitAll(consumer1, consumer2);
        }
    }
}





Thread Pools

A thread pool is a mechanism for managing thread usage. A thread pool reduces the overhead of creating and destroying threads and also provides a limit on the maximum number of threads which can be used at one time. Once the thread limit is reached, jobs queue up and start as threads become available. While the reduction of overhead and throttling of threads are desirable traits, there are some limits when using thread pools. For example, the threads in a thread pool:

  1. Cannot be named, which can make debugging more difficult.

  2. Are always background threads. You cannot create any foreground threads from the thread pool.

The upper limit of a thread pool is set with the ThreadPool.SetMaxThreads method. The ThreadPool.SetMinThreads method is an optimizing technique which instructs the pool manager not to delay in the allocation of threads up to the minimum limit. Increasing the minimum thread count does not cause the threads to be created immediately; instead, threads are created on demand. The time delay before creating threads above the minimum number helps to prevent a large burst of small tasks from creating a large number of threads, which would decrease performance because of all the thread overhead.
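The pool limits can be inspected and adjusted as sketched below (the class name and the doubled minimum are illustrative choices; the actual default limits vary by runtime version and machine):

```csharp
using System;
using System.Threading;

class PoolLimitsExample
{
    static void Main()
    {
        int workerMax, ioMax, workerMin, ioMin;
        ThreadPool.GetMaxThreads(out workerMax, out ioMax);
        ThreadPool.GetMinThreads(out workerMin, out ioMin);
        Console.WriteLine("Max: {0} worker / {1} I/O", workerMax, ioMax);
        Console.WriteLine("Min: {0} worker / {1} I/O", workerMin, ioMin);

        // Raise the minimum so the pool manager hands out up to that many
        // threads without its usual ramp-up delay. The threads are still
        // created on demand, not immediately.
        ThreadPool.SetMinThreads(workerMin * 2, ioMin);
    }
}
```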

A number of features in .NET use the thread pool implicitly, such as ASP.NET, WCF, Web Services, and any of the methods that end in "Async". The thread pool can be used explicitly by:

  1. Calling ThreadPool.QueueUserWorkItem
  2. Through asynchronous delegates
  3. Through BackgroundWorker
  4. Through the TPL or PLINQ
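The first option can be sketched in a few lines (the class name, state argument, and the ManualResetEvent used for completion signaling are illustrative): QueueUserWorkItem hands a delegate and a single object argument to a pool thread, much like Thread.Start.

```csharp
using System;
using System.Threading;

class QueueWorkItemExample
{
    static void Main()
    {
        using (ManualResetEvent done = new ManualResetEvent(false))
        {
            // Queue work onto a pool thread; the single state argument
            // arrives typed as object, just as with Thread.Start(object).
            ThreadPool.QueueUserWorkItem(state =>
            {
                Console.WriteLine("Pool thread got: {0}", state);
                done.Set();                  // signal completion
            }, "hello");

            // Pool threads are background threads, so without this wait
            // the process could exit before the work item ever runs.
            done.WaitOne();
        }
    }
}
```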

Note: The Task classes in the TPL were introduced in .NET 4.0 and are faster and more convenient replacements for the older constructs. The Task class is a replacement for ThreadPool.QueueUserWorkItem and Task<TResult> is a replacement for asynchronous delegates. The Task class is described in the Parallel Programming article.
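The replacement can be sketched as follows (class name and the sample computation are illustrative): a plain Task stands in for QueueUserWorkItem, while Task<TResult> carries a return value back and rethrows any unhandled exception when Result or Wait is called, just as EndInvoke does for asynchronous delegates.

```csharp
using System;
using System.Threading.Tasks;

class TaskReplacementExample
{
    static void Main()
    {
        // Replaces ThreadPool.QueueUserWorkItem: queue work on the pool.
        Task plain = Task.Factory.StartNew(
            () => Console.WriteLine("work item running"));

        // Replaces an asynchronous delegate: runs on the pool and
        // returns a value via the Result property.
        Task<int> withResult = Task.Factory.StartNew(() => 6 * 7);

        plain.Wait();
        Console.WriteLine(withResult.Result); // 42
    }
}
```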

The difference between explicitly creating tasks with ThreadPool.QueueUserWorkItem and asynchronous delegates is that asynchronous delegates can pass arguments in both directions and allow unhandled exceptions to be rethrown on the original thread (so they do not require explicit exception handling). The following program uses an asynchronous delegate to run work on a thread from the thread pool. The program creates 5 seconds of work on the control thread (thread 1) and 10 seconds of work for the thread from the thread pool (thread 3). Some of the work is performed by both threads at the same time. The control thread finishes first and waits for the pool thread to finish. Total run time is slightly over 10 seconds.

Using Pool Thread Via Asynchronous Delegates
.Simple Asynchronous Method Call via Delegate


using System;
using System.Threading;
using System.Diagnostics;

namespace DelegateAsynchronous
{
    delegate void TheDelegate(string s);

    static class StatClass
    {
        public static void DelMethod(TheDelegate parmDelegate)
        {
            System.Console.WriteLine("DelMethod: Started on thread id: {0} at: {1}",
                                      Thread.CurrentThread.ManagedThreadId,
                                      DateTime.Now.ToString("hh:mm:ss tt", System.Globalization.DateTimeFormatInfo.InvariantInfo));
            /******************************
             * Asynchronous Delegate Call *
             ******************************/
            IAsyncResult asyncRes = parmDelegate.BeginInvoke("I am the Delegate Parameter - Asynchronous", null, null);

            // 5 Seconds of Work
            for (int i = 0; i < 5; i++)
            {
                System.Console.WriteLine("*** Doing Work on thread id: {0} at: {1}",
                                          Thread.CurrentThread.ManagedThreadId,
                                          DateTime.Now.ToString("hh:mm:ss tt", System.Globalization.DateTimeFormatInfo.InvariantInfo));
                System.Threading.Thread.Sleep(1000);
            }

            parmDelegate.EndInvoke(asyncRes);
            System.Console.WriteLine("DelMethod: Ended on thread id: {0} at: {1}",
                                      Thread.CurrentThread.ManagedThreadId,
                                      DateTime.Now.ToString("hh:mm:ss tt", System.Globalization.DateTimeFormatInfo.InvariantInfo));
        }
    }

    class Program
    {
        static void Header()
        {
            System.Console.WriteLine("*********************************************************");
            System.Console.WriteLine("***   Asynchronous Method Call with Passed Delegate   ***");
            System.Console.WriteLine("*********************************************************\n");
        }

        static void TheMethod(string s)
        {
            System.Console.WriteLine("TheMethod: Started on thread id: {0} at: {1}",
                          Thread.CurrentThread.ManagedThreadId,
                          DateTime.Now.ToString("hh:mm:ss tt", System.Globalization.DateTimeFormatInfo.InvariantInfo));
            Console.WriteLine(s);

            // 10 Seconds of Work
            for (int i = 0; i < 10; i++)
            {
                System.Console.WriteLine("*** Doing Work on thread id: {0} at: {1}",
                                          Thread.CurrentThread.ManagedThreadId,
                                          DateTime.Now.ToString("hh:mm:ss tt", System.Globalization.DateTimeFormatInfo.InvariantInfo));
                System.Threading.Thread.Sleep(1000);
            }
            System.Console.WriteLine("TheMethod: Ended on thread id: {0} at: {1}",
                          Thread.CurrentThread.ManagedThreadId,
                          DateTime.Now.ToString("hh:mm:ss tt", System.Globalization.DateTimeFormatInfo.InvariantInfo));
        }


        static void Main()
        {
            Stopwatch myStopWatch = new Stopwatch();
            myStopWatch.Start();
            Program.Header();
            System.Console.WriteLine("Main: Started on thread id: {0} at: {1}",
                          Thread.CurrentThread.ManagedThreadId,
                          DateTime.Now.ToString("hh:mm:ss tt", System.Globalization.DateTimeFormatInfo.InvariantInfo));

            // Define the delegate
            TheDelegate myDelegate1 = new TheDelegate(TheMethod);

            // Invoke the delegate
            StatClass.DelMethod(myDelegate1);

            System.Console.WriteLine("Main: Ended on thread id: {0} at: {1}",
                          Thread.CurrentThread.ManagedThreadId,
                          DateTime.Now.ToString("hh:mm:ss tt", System.Globalization.DateTimeFormatInfo.InvariantInfo));
            myStopWatch.Stop();
            System.Console.WriteLine("\nTotal run time is: {0}\n", myStopWatch.Elapsed);
        }
    }
}



