
Multithreading in .NET Applications, Part 2, Page 2

  • June 19, 2003
  • By Mark Strawmyer

Protecting Operating System Resources

When two or more threads, within or across processes, need to access an operating system resource, there must be a control that limits conflicting access to the resource; otherwise, one thread could interfere with another's use of it. System.Threading.WaitHandle provides the base class for controlling exclusive access to operating system specific resources. It is used for synchronizing resources between managed and unmanaged code, and it exposes operating system specific functionality such as waiting on multiple resources. Classes derived from WaitHandle must implement a signaling mechanism to indicate taking or releasing exclusive access to a resource.

System.Threading.Mutex is a class derived from WaitHandle. Only one thread at a time can own a mutex. Before accessing the resource, each thread attempts to gain ownership of the mutex by calling one of the wait methods, such as WaitOne(). If the mutex is already owned, the requesting thread blocks until ownership becomes available. When a thread is done with the mutex, it signals completion through a call to the ReleaseMutex() method.

Sample Code Listing

The following code contains a sample console-based application. The application creates five different threads that all compete to use the same simulated system resource. Each thread is forced to wait until it can have exclusive access to the resource before it can continue.

using System;
using System.Threading;

namespace CodeGuru.MultithreadedPart2
{
  /// <remarks>
  /// Example console application demonstrating the use of a mutex.
  /// </remarks>
  class MutexExample
  {
    // Control access to the resource
    private static Mutex _mutex = new Mutex();

    /// <summary>
    /// The main entry point for the application.
    /// </summary>
    [STAThread]
    static void Main(string[] args)
    {
      // Create processing threads to use the system resource
      for(int i = 0; i < 5; i++)
      {
        // Call SomeProcess on a new thread
        Thread thread = new Thread(new ThreadStart(SomeProcess));
        thread.Name = String.Format("Thread{0}", i + 1);
        thread.Start();
      }
    }

    /*
     * Represents some process to consume a resource.
     */
    private static void SomeProcess()
    {
      UseSimulatedResource();
    }

    /*
     * Represents a system resource that must be synchronized.
     */
    private static void UseSimulatedResource()
    {
      // Wait until it is safe to use the resource
      Console.WriteLine("{0} waiting on resource",
                        Thread.CurrentThread.Name);
      _mutex.WaitOne();
      Console.WriteLine("{0} has resource",
                        Thread.CurrentThread.Name);

      // Put the thread to sleep to pretend we did something
      Thread.Sleep(1000);

      // Release the resource
      Console.WriteLine("{0} done with resource\r\n",
                        Thread.CurrentThread.Name);
      _mutex.ReleaseMutex();
    }
  }
}

Testing the Sample

Run the sample console application provided above. The output will vary according to how quickly each thread starts and the order in which the threads request the resource, so the result is likely to differ slightly from run to run. The output will look roughly as follows:
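One representative interleaving, based on the messages the listing writes to the console, is shown below; the exact order of the messages will differ on each run.

Thread1 waiting on resource
Thread1 has resource
Thread2 waiting on resource
Thread3 waiting on resource
Thread4 waiting on resource
Thread5 waiting on resource
Thread1 done with resource

Thread2 has resource
Thread2 done with resource

Thread3 has resource
Thread3 done with resource

Thread4 has resource
Thread4 done with resource

Thread5 has resource
Thread5 done with resource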

Possible Enhancements

Despite the coverage I've done in Parts 1 and 2 of the multithreaded exploration, there are still other topics to cover. Some of the additional topics that you should consider exploring for yourself are as follows:

  • C# has a lock statement (SyncLock in VB.NET) that can be used in place of the Monitor.Enter and Monitor.Exit methods. Rather than calling Monitor.Enter at the beginning of the code region and Monitor.Exit at the end, the lock statement encapsulates all the code within a block delimited by braces {} and releases the lock automatically when the block is exited (a brief sketch appears after this list).
  • A condition known as a deadlock can occur when two threads each wait on the other and neither can proceed. This is a critical condition that you must take care to avoid when synchronizing resources. A common way for a deadlock to occur is for A to wait on B to complete while, at the same time, B waits on A to complete. To see an example of a deadlock, comment out the line of code in the Dequeue method of the Monitor example from Part 1 where Monitor.Pulse(this._queue); is called just after the lock statement. This leaves the enqueueThread waiting for the queue resource to be released and the dequeueThread waiting for the enqueueThread to do something.
  • A common scenario is to create a thread that spends much of its lifetime sleeping or waiting for an event to occur, which is an inefficient use of resources. The ThreadPool class allows you to be more efficient by maintaining a pool of worker threads and monitoring wait operations for status changes. When a wait operation completes, a worker thread from the pool executes the appropriate callback function so the work can continue. This allows for more efficient use of threads and resources (a second sketch after this list shows the basic usage).
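
As a quick illustration of the first point, the following is a minimal sketch of the lock statement. The LockExample class and its _queue field are hypothetical stand-ins for the shared state that the Monitor example in Part 1 protected; they are not code from that article.

using System;
using System.Collections;
using System.Threading;

namespace CodeGuru.MultithreadedPart2
{
  /// <remarks>
  /// Minimal sketch of the lock statement; the _queue field and the
  /// Enqueue method are hypothetical stand-ins for shared state.
  /// </remarks>
  class LockExample
  {
    // Shared state protected by the lock
    private Queue _queue = new Queue();

    /// <summary>
    /// The main entry point for the sketch.
    /// </summary>
    [STAThread]
    static void Main(string[] args)
    {
      LockExample example = new LockExample();
      example.Enqueue("some work item");
    }

    /*
     * Adds an item to the shared queue under the protection of the lock.
     */
    public void Enqueue(object item)
    {
      // Equivalent to Monitor.Enter(this._queue) / Monitor.Exit(this._queue),
      // except the lock is released automatically when the block exits,
      // even if an exception is thrown.
      lock(this._queue)
      {
        this._queue.Enqueue(item);
      }
    }
  }
}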

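The following is a minimal sketch of asking the thread pool to monitor a wait operation through ThreadPool.RegisterWaitForSingleObject. The AutoResetEvent, the WaitComplete callback, and the timeout value are illustrative assumptions rather than code from the article.

using System;
using System.Threading;

namespace CodeGuru.MultithreadedPart2
{
  /// <remarks>
  /// Minimal sketch of using the thread pool to monitor a wait operation.
  /// </remarks>
  class ThreadPoolExample
  {
    /// <summary>
    /// The main entry point for the sketch.
    /// </summary>
    [STAThread]
    static void Main(string[] args)
    {
      // Event that the thread pool will watch on our behalf
      AutoResetEvent waitEvent = new AutoResetEvent(false);

      // Ask the thread pool to monitor the wait operation; when the event
      // is signaled (or the 5 second timeout expires), a worker thread
      // from the pool runs the callback.
      ThreadPool.RegisterWaitForSingleObject(waitEvent,
        new WaitOrTimerCallback(WaitComplete), "example state", 5000, true);

      // Simulate some other work, then signal the event
      Thread.Sleep(1000);
      waitEvent.Set();

      // Give the callback a chance to run before the process exits
      Thread.Sleep(1000);
    }

    /*
     * Callback executed by a thread-pool worker when the wait completes
     * or times out.
     */
    private static void WaitComplete(object state, bool timedOut)
    {
      Console.WriteLine("Callback ran for '{0}'; timed out = {1}",
                        state, timedOut);
    }
  }
}
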
Future Columns

Due to the requests and suggestions I received for Part 1 of the multithreading series, the next column will be Part 3 on multithreading. It will cover using the classes in the System.Net and System.Threading namespaces in combination to create a server-based listening application that processes requests on multiple threads. If there is something in particular that you would like to see explained here, you can reach me at mstrawmyer@crowechizek.com.

About the Author

Mark Strawmyer, MCSD, MCSE (NT4/W2K), MCDBA, is a Senior Architect of .NET applications for large- and mid-size organizations. Mark is a technology leader with Crowe Chizek in Indianapolis, Indiana. He specializes in architecture, design, and development of Microsoft-based solutions. You can reach Mark at mstrawmyer@crowechizek.com.




