Increase .NET Performance Using Threads
Increase your users' perceived performance of your .NET application by using multithreading.
Technology Toolbox: C#
You can have a huge impact on customer satisfaction by improving your .NET application's perceived performance, that is, users' perception of the application's performance. You can have the fastest box in the world running the latest OS and an application developed with the latest tools, but if users "feel" that the application is slow, it is slow.
You can help build a higher-performance application by using threads to perform some of the app's nonessential work in the background. This allows users to interact with the user interface with minimal interruption, and thereby increases their perception of application performance.
I'll show you some templates for WinForms applications and how to use threads to solve some common issues. I'll also address some of the problems multiple threads can give you; believe me, the world of threads can get a little tricky at times. Let's dive right into some sample applications and see how you can use threads to help increase users' perceived performance of a WinForms app.
A thread is generally defined as a single unit of code execution running within a process and granted execution time (called a timeslice) by the operating system. In .NET, your Main (or start-up) method constitutes the main thread for the application. This main thread can, in turn, spin off numerous worker threads, which can also spin off other threads. All of these threads run under a single .NET AppDomain. The operating system is responsible for giving each thread processor time based on the thread's execution properties and the number of CPUs available to execute threads. (.NET also provides you with advanced methods for controlling which threads run under which CPUs, as well as access to the thread pool itself, but these are more advanced topics best discussed later.)
Spin Your First Thread
I won't bore you with the computer-science details of preemptive multitasking. Instead, let's take a look at how to spin off a worker thread. .NET encapsulates all of the thread-management functionality inside the System.Threading namespace, so you need to reference that namespace anywhere you want to work with threads:
public ThreadStart ts_ThreadEntryPoint;
public Thread t_myThread;

ts_ThreadEntryPoint = new ThreadStart(myMethod);
t_myThread = new Thread(ts_ThreadEntryPoint);
t_myThread.Start();
A thread needs an entry point, called a ThreadStart object. The ThreadStart merely points to the procedure or method where you want the thread to begin. Once it's defined, you create the new thread (passing in the entry point) and then start it. Once started, the thread executes under a different thread context and doesn't have direct access to any information from the procedure or class from which it was started. This is an important concept to understand once we begin talking about updating a UI form from a worker thread.
Now that you know how to start a thread, let's dive right in and try out our first example. Let's say a form has a control or menu item that kicks off a long-running business operation (such as a file archive) when users click on it. Without threads, you could simply create the file archive in the control's event handler. But if the archiving took 30 seconds, users couldn't work on anything else in the application for 30 seconds. Their perception of the application's performance would improve substantially if you could return UI control to them sooner. The key here is that you need to select only those operations that are truly not key to the users' interaction with the application and are safe to run asynchronously from the users' interaction with the UI form.
I've created a sample form that has a simple textbox, progress meter, and a set of buttons to Start, Pause, and Cancel what I call the BigBusinessOperation. As the BigBusinessOperation runs in the background, you want to give users information on how close it is to completing, but still allow them to interact with other sections on the form (see Listing A).
The BigBusinessOperation is spun off as a worker thread when users click on the Start button. Notice that the name of the thread and its priority are also set before the thread is started. A thread can have one of these priorities: ThreadPriority.Highest, ThreadPriority.AboveNormal, ThreadPriority.Normal, ThreadPriority.BelowNormal, and ThreadPriority.Lowest. The thread scheduler uses these priorities to schedule timeslices for all threads, so choose carefully; the wrong priority can negatively impact the performance of your UI (main) thread.
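Putting the pieces together, here's a minimal console sketch of configuring a worker thread before starting it. The names and the body of BigBusinessOperation are illustrative stand-ins, not the article's actual listing:

```csharp
using System;
using System.Threading;

class ThreadSetupDemo
{
    static void BigBusinessOperation()
    {
        // Stand-in for a long-running operation
        Thread.Sleep(100);
    }

    static void Main()
    {
        Thread t_myThread = new Thread(new ThreadStart(BigBusinessOperation));

        // Configure the thread before starting it
        t_myThread.Name = "BigBusinessOperation";
        t_myThread.Priority = ThreadPriority.BelowNormal;
        t_myThread.IsBackground = true;   // terminates with the main thread

        t_myThread.Start();
        t_myThread.Join();   // block until the worker finishes

        Console.WriteLine(t_myThread.IsAlive);   // False once it has completed
    }
}
```

Setting the properties before Start() matters; Name, for instance, can only be assigned once per thread.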
Now take a look at the worker thread, which is the BigBusinessOperation procedure. I've created a simulation of a long-running operation that loops 20 times and sleeps for one second between each loop iteration. This simulates a 5 percent completion of a business operation per loop iteration. You need to update the progress meter after each loop, so the thread needs access to that control.
Update the UI From a Worker Thread
Here's where a lot of threading mistakes are made. Sure, you could pass a reference to the control and then update it within this loop, but remember that this is the worker thread, and you would end up interfering with the events over on the UI thread side if you tried to update a control outside of the UI thread. It might work for a while, but eventually you'd end up either with application deadlock (a condition where two threads each wait for the other to complete or release a common resource) or the failure of your WinForms application to repaint properly. You need to provide a safe way for the worker thread to let the UI thread know that it's time to update the progress meter. You do this through the use of delegates.
A delegate is just a fancy declaration and instantiation of a function pointer. You can define a delegate to a progress meter update method at the top of your class, then create an instance of the delegate at the beginning of the BigBusinessOperation:
public delegate void ProgressDelegate(string strProgressValue);

ProgressDelegate fpr = new ProgressDelegate(UpdateProgressMeter);
The UpdateProgressMeter method inside your UI form class takes a string parameter that tells it what value to set the progress meter to. Invoking this method from the worker thread tells the UI thread that it needs to schedule some time within its thread to make this call. Notice that you have to pass parameters as an array of objects:
this.Invoke(fpr, new object[] {strProgressValue});
The Control.Invoke() method is synchronous, meaning that it won't return until the UpdateProgressMeter method in the UI thread completes. Once invoked, UpdateProgressMeter can update the progress meter safely and force a refresh of the control (see Figure 1 to view the application in the middle of running the BigBusinessOperation).
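A minimal sketch of what the form-side method might look like (ProgressForm, progressMeter, and the ProgressValue property are illustrative names, not taken from the article's listing):

```csharp
using System;
using System.Windows.Forms;

public class ProgressForm : Form
{
    private ProgressBar progressMeter = new ProgressBar();

    public ProgressForm()
    {
        progressMeter.Minimum = 0;
        progressMeter.Maximum = 100;
        Controls.Add(progressMeter);
    }

    // Target of the ProgressDelegate; runs on the UI thread
    // because the worker reaches it through Control.Invoke()
    public void UpdateProgressMeter(string strProgressValue)
    {
        progressMeter.Value = int.Parse(strProgressValue);
        progressMeter.Refresh();   // force an immediate repaint
    }

    public int ProgressValue
    {
        get { return progressMeter.Value; }
    }

    static void Main()
    {
        ProgressForm f = new ProgressForm();
        f.UpdateProgressMeter("45");
        Console.WriteLine(f.ProgressValue);
    }
}
```

Because only the UI thread ever touches progressMeter, no extra locking is needed inside this method.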
You can also create buttons to Pause and Cancel the BigBusinessOperation (if it is running). Pausing a thread is quite simple. If the thread's IsAlive property is true, the thread is working (not necessarily executing at that given time, but not canceled). You need to check this property; otherwise, you might generate an exception by attempting to cancel or pause a thread that no longer exists. This statement pauses a thread's execution and puts it into a Suspended state:

t_myThread.Suspend();

And, of course, you can resume a thread using the Resume() method:

t_myThread.Resume();
You can also cancel the execution of a worker thread. There is a lot of debate on the best way to cancel a thread, but I use the Thread.Abort() method in this example. Invoking this method by itself is not enough; it merely generates a ThreadAbortException within the worker thread. If you don't handle this exception, your worker thread will continue to run even though your UI application has exited already. Test this by running this example from the VS.NET IDE, but comment out the return statement inside the exception handler code in the worker thread. Then, bring up the application, press Start, and exit the application. Notice that the process is still resident in memory (use Task Manager to see this), and also notice that the Pause and Stop VCR buttons within the IDE debugger are also active, signaling that active code is still running.
Terminate a Worker Thread Safely
You can quickly ensure that all worker threads terminate when the UI terminates by setting the IsBackground property of each worker thread to true before starting the thread. This signals that the worker threads should terminate when the thread that spawned them terminates.
If you plan to cancel a thread with Thread.Abort(), you must wrap the entire thread within a try...catch block and handle the ThreadAbortException:

try
{
	// Worker thread code goes here
}
catch (ThreadAbortException)
{
	// Cleanup code goes here
	// Break out of the worker thread
	return;
}
In addition, you need to invoke the Thread.Join() method, which blocks the current thread (in this case, the UI thread) until the worker thread has finished executing. This is why you need to break out of the worker thread by capturing the ThreadAbortException. If you fail to break out of the worker thread in this exception handler code, you generate a Thread Exception because the worker thread continues to run after it's been requested to abort (see Figure 2):
// Cancel the worker thread
t_myThread.Abort();
// Block this thread until worker thread
// termination has concluded
t_myThread.Join();
Lastly, you always need to place a Thread.Sleep() inside your worker thread if you intend to run it in a continuous loop. If you don't, the only interruptions your thread will receive come from the operating system, and you'll never receive the "soft" interrupts for Abort, Interrupt, and others. At a minimum, you should give up the remainder of your worker thread's timeslice at some point within the loop. Passing in zero tells .NET to give up the remaining timeslice for the thread; otherwise, the parameter is the number of milliseconds to sleep:

Thread.Sleep(0);
You might have noticed that all the code was local to the Form class, something that's rarely done in the real world. In the next example, I'll show you how to spin off threads using methods of business objects external to the Form class, as well as how to use the BeginInvoke() and EndInvoke() methods of a delegate to communicate asynchronously back to the main thread. Make sure not to call BeginInvoke or EndInvoke if the parent thread (such as the UI thread) has already terminated while the worker thread is still active; this leads to a permanent blocking condition.
Let's create a simple text receiver class that sits and waits for text to be put into a shared buffer. Once the text is inserted, the method updates a textbox field in the UI form with the text. I realize that using a message queue product (such as MSMQ) would be a better design solution, but I've chosen this example specifically to outline some of the perils of asynchronous delegates, as well as obtaining and releasing locks on shared variables.
Notice that a few things have changed now that you've moved the method to be threaded into its own class (see Listing B). First, you no longer have access to the instance of the form that's running (which you need to create your UI update delegate). You must always have the instance of the form class and not the definition itself to create a delegate properly. You'll get a compile error if you merely try to reference the MainWindow class and not the instance of that Form class. To accomplish this, create a Form property within the Receiver class that needs to be set before spinning off GetMessages() as a thread. This gives the instance of the Receiver class access to the MainWindow form instance, and then you can create the delegate:
fprUIUpdate = new UIUpdateDelegate(frmParent.UpdateReceiveBox);
The code is similar in that it loops waiting for something to happen. In this case, it waits for text to be placed into a shared message buffer (which you can implement using a MessageBuffer class). Herein lies the next threading topic, called thread synchronization. The UI form sets this message buffer and the worker thread gets the contents, but you need a way to make sure the two threads don't clobber each other while trying to access the shared area. This could lead to something called a race condition, where separate threads try to access and change a common area at the same time. One way to eliminate a race condition is by using a ReaderWriterLock.
Enable Thread-Safe Synchronization
A ReaderWriterLock enables threads to obtain read or write locks on an object, preventing others from accessing and/or changing the contents of the shared area. Threads that ask for a read lock through the AcquireReaderLock() method can acquire concurrent locks as long as there are no writer locks on the object. No reader or writer locks will be permitted while there's a writer lock on the object (which you create with the AcquireWriterLock() method).
Create a ReaderWriterLock object inside your MessageBuffer class that you'll use to signal the locking and unlocking of the shared message buffer value (strBuffer). Create both values as static within the class so that you don't have to instantiate them and pass them back and forth between the threads that need to use strBuffer. A thread uses this code when it needs to acquire a lock:

rwl.AcquireWriterLock(1000);

The value of 1000 signifies the number of milliseconds to wait for a writer lock to be acquired. If a writer lock cannot be acquired during that time, an application exception is generated, which you need to handle. Once you acquire the writer lock, you can manipulate the strBuffer string and then release the lock:

rwl.ReleaseWriterLock();
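Here's a minimal sketch of the MessageBuffer pattern. The rwl and strBuffer names follow the article; the Put/Get methods and the Main driver are illustrative additions. Wrapping the protected work in try/finally guarantees the lock is released even if an exception is thrown:

```csharp
using System;
using System.Threading;

class MessageBuffer
{
    static ReaderWriterLock rwl = new ReaderWriterLock();
    static string strBuffer = "";

    public static void Put(string text)
    {
        rwl.AcquireWriterLock(1000);   // wait up to 1000 ms, else ApplicationException
        try
        {
            strBuffer = text;          // exclusive access while writer lock is held
        }
        finally
        {
            rwl.ReleaseWriterLock();   // always release, even on error
        }
    }

    public static string Get()
    {
        rwl.AcquireReaderLock(1000);   // concurrent readers allowed
        try
        {
            return strBuffer;
        }
        finally
        {
            rwl.ReleaseReaderLock();
        }
    }

    static void Main()
    {
        Put("hello");
        Console.WriteLine(Get());
    }
}
```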
Now that you've stored the text, it's time to notify the UI update delegate. This time, instead of calling Invoke on the form control (as you did in the previous example), call BeginInvoke on the instance of the delegate to initiate the UI update. The difference is that a plain delegate's Invoke() runs synchronously on the calling thread; it doesn't marshal the call to the UI thread the way Control.Invoke() does for WinForms controls, which you used in the first example:
iasResult = fprUIUpdate.BeginInvoke(strBuffer, null, null);
With BeginInvoke, you pass the defined procedure parameters first. The compiler knows what arguments need to be passed because the method signature is known at declaration time. The next parameter is a pointer to an asynchronous callback routine (which I'll talk about in a moment), and the last parameter is a state object you can pass along to that callback. BeginInvoke immediately returns an IAsyncResult object, which contains state information about the asynchronous operation. You'll pass this object later to the EndInvoke method. MSDN documentation recommends that you always call EndInvoke sometime after BeginInvoke to retrieve the asynchronous callback's final state; otherwise, a potential memory leak can occur.
You can also pass another delegate as an asynchronous callback for BeginInvoke, which will be called once the initial delegate completes:
iasResult = fprUIUpdate.BeginInvoke(strBuffer, new AsyncCallback(myCallbackDelegate), null);
Note that myCallbackDelegate might not actually be called under the thread context of the thread that called BeginInvoke in the first place. This means that you shouldn't attempt to do any locking, monitoring, or other thread synchronization in an asynchronous callback.
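The whole BeginInvoke/callback/EndInvoke round trip can be sketched with a plain console delegate. The type and method names (UIUpdateDelegate, ShowText, OnComplete) are illustrative, and note that asynchronous delegate invocation is a .NET Framework feature that throws PlatformNotSupportedException on .NET Core:

```csharp
using System;
using System.Threading;

class AsyncDelegateDemo
{
    // Illustrative delegate matching the article's UI update signature
    public delegate void UIUpdateDelegate(string strText);

    static void ShowText(string strText)
    {
        Console.WriteLine("Received: " + strText);
    }

    // Asynchronous callback; may run on a thread-pool thread,
    // so do no locking or other thread synchronization here
    static void OnComplete(IAsyncResult iasResult)
    {
        Console.WriteLine("Callback fired");
    }

    static void Main()
    {
        UIUpdateDelegate fprUIUpdate = new UIUpdateDelegate(ShowText);

        // Queue the delegate on a thread-pool thread; returns immediately
        IAsyncResult iasResult = fprUIUpdate.BeginInvoke(
            "hello", new AsyncCallback(OnComplete), null);

        // Blocks until the asynchronous delegate has finished
        fprUIUpdate.EndInvoke(iasResult);

        Thread.Sleep(200);   // give the callback a moment to run before exit
    }
}
```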
Once you finish whatever asynchronous logic you want to do in your thread, it's time to call EndInvoke to get the status:

fprUIUpdate.EndInvoke(iasResult);

If the UI update delegate returned a value, it would be returned through the EndInvoke call:

result = fprUIUpdate.EndInvoke(iasResult);
If you call EndInvoke from the worker thread, that thread will be blocked until the asynchronous delegate completes. No blocking would occur if you manage to call EndInvoke from within the asynchronous delegate itself (which should be thought out carefully; remember that the delegate could be running under a different thread context).
The UI update delegate created in the worker thread points to UpdateReceiveBox (see Listing 1). You set a flag called boolWriting to signal to the event handlers in the UI thread that the asynchronous delegate is doing some UI work. Without this flag, you'd end up with a deadlock situation if you click on the Send button in the middle of a receive operation by the worker thread. In that scenario, the worker thread would be blocked waiting on the EndInvoke (or the completion of the UI Update delegate), but would hold the ReaderWriterLock. Meanwhile, the UI thread would be blocked in the Send button handler, waiting for the ReaderWriterLock to be released. The UI thread would be blocked, so the UI update asynchronous delegate wouldn't get any time to complete, and you'd have an application deadlock. Prevent this by using the flag in the Send button event handler to not allow another Send until the worker thread finishes its receive.
Lock Critical Code Sections With Monitor
Let's talk about managing critical sections of thread code. A critical section of code is defined as code that only one thread should be able to execute at one time. Take the old ATM example: If my wife and I go to two different ATMs at the same time, we should not be able to withdraw money at the exact same time. The pseudocode would look something like this:

Lock Critical Code Section
Check Bank Balance
If Bank Balance Is Enough to Cover Withdrawal
	Dispense Cash and Debit Account
Unlock Critical Code Section
You can accomplish this in .NET using the Monitor class. Monitor.Enter() locks a critical code section, and Monitor.Exit() unlocks it. If another thread already holds the lock when you call Monitor.Enter(), the current thread blocks until the critical code section is unlocked. There is no timeout interval for Monitor.Enter(), so use Monitor.TryEnter() if you want to make sure that your code never blocks indefinitely. It returns true if the critical code section lock is obtained, or false if not.
You need to pass an object to Monitor for locking; value types such as booleans and integers won't work, because they get boxed into a new object on every call. A common technique is to create a small dedicated object whose only job is to represent a lock on something larger. I've seen many examples where the "this" object is used for Monitor, but you should avoid this due to the overhead of locking an entire class instantiation (like a form).
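Here's a minimal sketch of this dedicated-lock-object technique (this is not the article's Listing 2; the lockObject field, entryCount counter, and Main driver are illustrative):

```csharp
using System;
using System.Threading;

class LogIt
{
    // Dedicated lock object; avoids locking "this" (the whole instance)
    private object lockObject = new object();
    private int entryCount = 0;

    public void LogMessage(string strMessage)
    {
        // TryEnter with a timeout guarantees we never block forever
        if (Monitor.TryEnter(lockObject, 1000))
        {
            try
            {
                // Critical section: only one thread at a time gets here
                entryCount++;
                Console.WriteLine(strMessage);
            }
            finally
            {
                Monitor.Exit(lockObject);   // always unlock
            }
        }
    }

    public int EntryCount
    {
        get { return entryCount; }
    }

    static void Main()
    {
        LogIt logger = new LogIt();
        logger.LogMessage("first entry");
        logger.LogMessage("second entry");
        Console.WriteLine(logger.EntryCount);
    }
}
```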
As an example, let's use the Monitor class to lock a critical section of code that opens and appends a log entry to a log file. You obviously can't have two threads updating the same file at the same time, so you can consolidate the logging into a LogIt class, with a single method (LogMessage) that's responsible for adding an entry to the log file (see Listing 2).
Once you create the class, you need to make sure you instantiate it and pass a reference to that same instance to every thread that uses it. Both threads must lock on the same object when they call the LogMessage method. If they don't, Monitor.Enter() will always succeed, because each thread is locking a different object and never finds the lock already held by another thread; you get no mutual exclusion at all. You must pass the instantiated LogIt object to all of your threads that plan on using it.
This brings us to another frequently asked question about threads: How do I pass parameters to threads? It's quite easy, especially when you plan on creating classes that contain your methods that will end up running as threads. Once you create the class, you can easily create a property or value within the class that holds the information you want to pass.
This happens to be the LogIt object in the Logger example. Create two classes: one for prime number generation (Primes) and another for factorial generation (Factorials). Each class has a property to hold an instance of the LogIt object, and also contains a method that you'll use as the starting point for the thread (see Listing 3).
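This property-based parameter passing can be sketched as follows (this is not the article's Listing 3; the LogIt stand-in is minimal and GeneratePrimes is reduced to a single log call):

```csharp
using System;
using System.Threading;

// Minimal stand-in for the article's LogIt class
class LogIt
{
    private object lockObject = new object();

    public void LogMessage(string strMessage)
    {
        lock (lockObject)   // shorthand for Monitor.Enter/Exit
        {
            Console.WriteLine(strMessage);
        }
    }
}

class Primes
{
    private LogIt logger;

    // The "parameter" is passed by setting this property before Start()
    public LogIt Logger
    {
        set { logger = value; }
    }

    // Entry point for the worker thread
    public void GeneratePrimes()
    {
        logger.LogMessage("Primes thread started");
    }
}

class Program
{
    static void Main()
    {
        LogIt sharedLogger = new LogIt();   // single shared instance

        Primes p = new Primes();
        p.Logger = sharedLogger;            // set before spinning off the thread

        Thread t = new Thread(new ThreadStart(p.GeneratePrimes));
        t.Start();
        t.Join();
    }
}
```

A Factorials class would follow the same shape, receiving the same sharedLogger instance so both threads serialize their log writes through one lock.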
The Primes class contains the GeneratePrimes method, which is the starting point for one of your threads. The main thread spins off two worker threads: one for primes and another for factorials. Through the centralized use of the LogIt object and Monitor class, both threads can safely and asynchronously log prime numbers found and factorials computed.
You can increase users' perceived performance of a .NET application through the use of threading. Should everything be threaded? Certainly not. But you should now have a good understanding of how you can implement threads in your .NET application, while staying away from some of the most common pitfalls.