In-Depth

Write a Web-Based Unit Test

Take advantage of testing features in Visual Studio Team System for Software Testers to produce higher-quality Web applications.

Technology Toolbox: VB 2005, Other: Visual Studio Team Edition for Software Testers

Web applications have a solid and growing presence in enterprise-level businesses.

Benefits of Web-based apps include a lower cost of ownership due to central deployment and maintenance, as well as client and operating-system independence. At the same time, Web interfaces are becoming both increasingly complex and more responsive with the introduction of Asynchronous JavaScript and XML (AJAX) into mainstream development. On top of this, service-oriented architectures, and Web services in particular, are maturing, contributing to an ever greater reliance on Web-based architectures.

One aspect of Web-based development that hasn't kept pace with these innovations is testing. Testing Web-based applications is complicated by several factors: a lack of cost-effective tools for both Web and Web-service testing, a lack of flexibility in the tools that do exist, and the fact that testing tools haven't been integrated into the major IDEs. At the same time, developers have pressing reasons to adopt more robust testing practices, including the increasing use of, and reliance on, the Web for all manner of business transactions, from online payments to voter registration, as well as ongoing security concerns around monetary transactions and personally identifiable information (PII). For all of these reasons (and many more), it is critical that developers test their applications as strenuously as possible.

The responsibility is on us, the developers, to ensure that the applications we put out can withstand the normal range of both authorized and unauthorized usage. Visual Studio Team Edition for Software Testers provides a host of tools and extensibility options to support advanced Web application testing, basic Web-service testing, and basic security testing. VSTS doesn't provide every tool you might need, but it does provide a solid set, and many third-party vendors have begun integrating their test tools into Team System.

I'll show you how to take advantage of several of the tools in VSTS to build Web tests and Web-service tests, as well as how to data-drive those tests and then perform load testing for a real-world simulation that indicates how your environment will perform once your application is deployed.

Note that you should always perform unit tests on your code before moving on to Web tests.

Web testing differs from the unit testing you perform on your code. Web testing assumes that your application has already passed its unit tests; now you need to perform functional testing. That is, you need to determine whether all of the code units you've tested individually work together.

You nearly always perform functional tests from an end-user perspective. Such a test should document the normal course of a business function from beginning to end. For example, in an expense application you may have unit tests that cover adding a trip, adding an expense item, and retrieving a list of all user trips. A functional test, by contrast, would cover the whole flow: a user lists all of the trips, adds a new trip (or edits an existing one), and then adds several expense items to it.
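To make the contrast concrete, here is a minimal sketch of what one of those unit tests might look like. The ExpenseService class and its AddTrip method are hypothetical stand-ins for the sample's business layer, not code from the article's download.

Imports Microsoft.VisualStudio.TestTools.UnitTesting

<TestClass()> _
Public Class TripUnitTests

    ' Unit test: exercises one code unit (adding a trip) in isolation.
    ' ExpenseService and AddTrip are hypothetical names for the business layer.
    <TestMethod()> _
    Public Sub AddTrip_ReturnsNewTripId()
        Dim service As New ExpenseService()
        Dim tripId As Integer = service.AddTrip("Test Trip", "Unit-test trip")
        Assert.IsTrue(tripId > 0, "AddTrip should return the new trip's id.")
    End Sub

End Class

A functional Web test for the same feature would instead drive the AddTrip.aspx page over HTTP and verify the result from the user's point of view.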

Functional tests are usually a QA responsibility. On some projects, including most smaller ones, developers or functional analysts perform them; on others, they belong entirely to the QA team and the developers never run them.

Web Services Behave Differently
Web services are an exception to this general rule. It might help to think of Web-service testing as an intermediate type of test: unit tests cover the business logic, functional tests cover the user interaction, and Web services fall somewhere in between.

Theoretically, you exercise the Web services your app relies on when you run your Web tests. The core issue for testing such services is when you write tests for a given component, and who writes them. For example, you can test your Web applications from beginning to end with your Web tests, but you should anticipate that the person (or group) who writes the Web-services layer isn't the same one who writes the user-interface layer. For that reason, your Web tests should cover both the functionality of a Web service and the result of a Web-service call.

You need to test your site under load after you create your basic functional tests and ensure that they all work. Load tests typically exercise multiple areas of functionality rather than a single one, and your goal is a little different: you do want the functions to work, but you don't necessarily care whether every rule associated with a given test passes. A load test is more a test of the hardware and environment than of the software. The information you're looking for includes response times, whether any part of the environment goes down at any point, CPU usage, and how your database performs.

It's always a good idea to have a thorough understanding of the architecture of the application you want to test before you begin testing the application. This enables you to put the results you receive into the proper context.

Assume that you have a Web-based expense report application (see Figure 1), with a data model that describes the application's data structure (see Figure 2). The Web application connects to a Web service that pulls data from a SQL Server 2005 database. The application requires only a minimal amount of code to get running, which makes it easier to demonstrate the process of testing a Web app.

Unfortunately, in version 1 of Team System the Web-service testing facilities require a lot of manual work on the developer's part. Once you get the hang of them (and I'll show you a trick that makes them easier to use), Web services become much easier to test, but there is still no easy solution for validating a result.

Begin by downloading the sample solution and opening the WebTesting solution in Visual Studio. Select Test, New Test from the main menu. Next, select Web Test from the Add New Test dialog, and name it GetTripsTest. Then select Create New Visual Basic test project from the Add Test To drop-down. Note that it's always better to separate tests from code for deployment purposes. Call the new project WebServiceTests. After you complete these steps, Internet Explorer opens with a recording window. Click on the Stop Recording button. In Visual Studio, you will see a blank window with GetTripsTest as the only node in the tree. Right-click on the node and select Add Web Service Request. This adds a URL node and a string body node.

Test a Web Service
Testing a Web service requires you to enter the body of the request manually, which isn't an intuitive process. You can make your job easier by right-clicking on the service.asmx file in the ExpenseServices project and selecting View in Browser. Next, select the GetTrips link on the page that lists the available Web methods. The resulting page displays the request and response in three formats: SOAP 1.1, SOAP 1.2, and HTTP POST. Copy the SOAP 1.1 request and paste it into the string body (see Figure 3), then select the content type of text/xml (the only option). This is the easy way to write a Web service request. Next, update the URL with the URL of the service (http://localhost:9947/ExpenseService/Service.asmx/GetTrips, in this case); this is the URL returned after you click on Invoke on the test page.
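For reference, the SOAP 1.1 request body you paste in looks roughly like the following. This is a sketch based on the standard template an .asmx test page generates; it assumes GetTrips takes no parameters and that the service uses the default http://tempuri.org/ namespace.

<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xmlns:xsd="http://www.w3.org/2001/XMLSchema"
               xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetTrips xmlns="http://tempuri.org/" />
  </soap:Body>
</soap:Envelope>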

At this point, you have a Web service test that always passes because it doesn't validate anything. The test database contains two trips, and you want to make sure that both trips are returned. The data in a Web service test comes back as straight XML, so figuring out whether the test passed is easier said than done. You can handle this in one of two ways. First, you can convert the test to a coded Web test by selecting the Generate Code icon at the top of the window and handling the PostRequest event, which lets you read the response XML into an XML document or a DataSet (this requires a little more work). Or, you can use a validation rule to verify that the correct data is in the result. The article's sample uses a validation rule.
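If you do take the coded route, the handler might look roughly like the following sketch. The handler name, the <Trip> element name, and the minimum-count check are illustrative assumptions rather than code from the article's sample; the sub is assumed to live inside the generated coded Web test class (with Microsoft.VisualStudio.TestTools.WebTesting imported), and you attach it to the request's PostRequest event with AddHandler.

' Hypothetical handler for the GetTrips request's PostRequest event.
' Loads the response XML and fails the test if fewer than two <Trip>
' elements (an assumed element name) come back.
Public Sub GetTripsRequest_PostRequest(ByVal sender As Object, _
        ByVal e As PostRequestEventArgs)

    Dim doc As New System.Xml.XmlDocument()
    doc.LoadXml(e.Response.BodyString)

    Dim trips As System.Xml.XmlNodeList = doc.GetElementsByTagName("Trip")
    If trips.Count < 2 Then
        ' Throwing from the handler is one simple way to force a failure.
        Throw New ApplicationException("Expected at least two trips in the response.")
    End If
End Sub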

Validation rules let you determine whether certain items in a response match the expected results, and they let the test infrastructure mark the test as passed or failed based on the presence or absence of those values. The Web test's built-in validation rules are Form Field, Find Text, Maximum Request Time, Required Attribute Value, and Required Tag. For this test, select the Find Text validation and enter "Test Trip" (without the quotes) into the Find Text field. Note that you can use regular expressions here if you need to.

The rule has two more properties of interest: Pass If Text Found and Level. Pass If Text Found lets you state whether you want the text to be found; setting it to False causes the test to pass if the text isn't found. Level isn't used by the Web test itself; rather, it comes into play during a load test.

Recall that I mentioned earlier that in many cases you don't care whether individual validations pass or fail during a load test. The Level setting, used in conjunction with a load-test setting I'll address shortly, determines whether this validation is ignored during a load test. For the moment, leave it as is. Finally, add a second Find Text validation rule that searches for the text "Trip 2" (without the quotes). If the Web test finds both of these values, the test passes.

Run the test either by clicking on the Start button at the top of the test window or by running the test from the Test Manager. The test fails if either (or both) of the values isn't found.
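The built-in rules cover most scenarios, but when they don't, VSTS also lets you write your own validation rule in code and attach it to a request just like a built-in one. The article's sample sticks with Find Text; the class below is only a rough sketch of what a custom rule for this test could look like, with a hypothetical class name and check.

Imports Microsoft.VisualStudio.TestTools.WebTesting

' Hypothetical custom validation rule: passes only when the response body
' contains both expected trip names.
Public Class BothTripsValidationRule
    Inherits ValidationRule

    Public Overrides Sub Validate(ByVal sender As Object, _
            ByVal e As ValidationEventArgs)
        Dim body As String = e.Response.BodyString
        e.IsValid = body.Contains("Test Trip") AndAlso body.Contains("Trip 2")
        If Not e.IsValid Then
            e.Message = "The response did not contain both expected trips."
        End If
    End Sub
End Class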

Test Web Applications
Theoretically, you write all the tests for all of your services (with passing results) before you test the application proper. So you might wonder: If all of your services pass their tests and you're binding directly to the service, why do you need to test the application at all? There are a couple of reasons to go ahead and test anyway. First, a passing service test doesn't mean that your navigation or your display of data is correct; that's a separate layer. Second, many applications consist of services provided by multiple people. You can see this today in "mashup" applications, where developers combine pieces from different sites to build one application. You can't be sure that your services will interact correctly without testing the interface.

Testing your Web application is similar to using Application Center Test—at first, anyway. Internet Explorer opens with a recording window when you create a new Web test. You work through your application, performing whatever function you've targeted for testing. When you're finished, click on the Stop button at the top of the recording pane. The first function to test is adding a trip. Do this by selecting Test | New Test from the main menu. Next, select Web Test and call it "AddTripTest." At this point, you can either add it to the existing Web service test project or create a new test project. It's good practice to separate these tests because your Web services might be consumed independently, and you might need to provide the tests for that service to another group. So, create a new test project called "WebTests."

Note that you must run the application before you try to record a Web test when you use the development Web server. You cannot access your application with a test until the development server is instantiated and the ports are created.

When Internet Explorer is displayed, navigate to the default.aspx page. Next, select the Add Trip link and enter "Tech Ed 2006" as the name. The description is optional, but in a functional test you would test both scenarios, with and without a description. Enter "6/11/2006" for the start date and "6/16/2006" for the end date, then click on the Save button. This should take you back to the default page. Click on Stop in the recording pane.

Take a careful look at the AddTripTest tree in Visual Studio (see Figure 4). This test reveals a series of URLs and metadata that were captured when running the page. VSTS records tests as a series of requests and responses, as well as all of the associated data. You need to take the time to understand how the recording flow works. For example, the second node lists the AddTrip.aspx page as the page you navigated to. However, the captured information is from the default.aspx page. It's critical to keep in mind that you cannot capture information regarding a page unless the response is from that page. The third node contains the information concerning your post.

You might wonder how you verify that the information you entered was saved. You do this by checking the drop-down list on the default.aspx page, but you'll encounter a problem: there is no request for default.aspx listed after the post. You can overcome this in a couple of different ways: make a Web service call (by inserting a Web service request) to the GetTrips method, or add another Web request, point it at the default.aspx page, and add a validation rule that searches for the appropriate value on the page. (You can also re-navigate to the default page during recording, not by using Refresh, but by clicking in the URL field and pressing Enter, or by clicking on the Go button, which adds another URL node.)

Run the Web App Test
You run a Web app test using the same steps you use to run a Web service test. One feature I didn't note earlier is the ability to step through a test: selecting Run Test (Pause Before Starting) lets you examine the requests and responses one at a time.

This test should pass because there is no unique index on the trip name. Be aware, though, that if you record an "add" type of test against a field that is constrained by a unique index, the test fails when you play it back. You can solve this problem with setup and cleanup scripts, which you access from the localtestrun.testrunconfig file that VSTS adds to your solution automatically the first time you create a test. This file gives you various options for changing test parameters, enabling code coverage, and setting setup and cleanup scripts. A script can be any executable with command-line parameters. For example, you might run the new sqlcmd.exe utility (SQL Server 2005's command-line tool) and pass it a script that tears down and rebuilds the database before a test run, as shown below.
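Such a setup script can be as simple as a one-line sqlcmd call. The server instance and script file here are hypothetical; -S (server) and -i (input file) are standard sqlcmd switches.

sqlcmd -S .\SQLEXPRESS -i ResetExpenseDb.sql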

You're almost ready to move on to load testing, but first you should record one more test: adding a single expense item to a trip. Start recording a new Web test (call it "AddExpenseItem") and navigate to the default.aspx page. Select Tech Ed 2006 from the drop-down list, then begin adding the new expense item's information. First, enter "6/11/2006" for the date and "Roundtrip airfare" for the description. Next, select Credit Card from the payment type drop-down list and enter "349.99" for the amount. Finally, click on Add to create the new expense item.

VSTS provides additional pieces of functionality you can take advantage of in Web testing. For example, examine the tree and you'll see that the second request includes a form-post parameter of DropDownList1=3; the 3 is the trip's identifier in the database.

But assume you want to test adding expense items for different trips. Also assume that you want that value to be random. You can hook up form parameters to a database to simulate a more realistic experience while simultaneously ensuring that you don't provide an invalid key value. Do this by selecting the DropDownList1=3 node from the second request, then right-click to select Properties. Click on the Value field, select the drop-down arrow, and then select Add Data Source. Next, enter the location of your SQL Server database and select the ERdata database and the Trips table.

Finally, go back to the properties for the DropDownList1=3 node and select ERdata1, Trips, trip_id in the Value field. You should now have a new data source, and your drop-down list should be linked to the trip_id field. The other item of interest is the Access Method property of the Trips table: select the table in the Web test, view its properties, and you can choose Sequential, Random, or Unique. This value has no effect on a Web test (the order is always sequential there), but it does count in a load test. For now, set it to Random.

The last item to take care of is the final request. Note that it contains a DropDownList1=3 node as well. This matters because the code automatically pulls the value of this drop-down list for use in the save, so it must be bound to the same data source. Set this parameter to the same data-source value as the previous one. Now you can run the test multiple times without every entry being applied to the same trip.

This doesn't matter much in straight functional testing, but it's critical in load testing: otherwise, every expense would be attached to a single trip. That one trip would load slowly while every other trip loaded quickly, giving you an inaccurate view of your Web site's performance.

The process of hooking your tests up to a database is called data-driven testing, and you can hook up almost any piece of data to a data source. You can also data-drive Web service tests, but that is a more manual process: after you add a data source, find the place in the body of the payload that you want to replace with a value from the database and enter {{DataSourceName.TableName.ColumnName}}; the value from the database is substituted at run time.
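For example, if the service exposed an AddExpenseItem operation that took a trip id (the operation and element names here are hypothetical), the bound portion of the SOAP body might look like this, using the ERdata1 data source created earlier:

<AddExpenseItem xmlns="http://tempuri.org/">
  <tripId>{{ERdata1.Trips.trip_id}}</tripId>
  <description>Roundtrip airfare</description>
  <amount>349.99</amount>
</AddExpenseItem>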

Perform Load Testing
You perform load testing after you write all of your Web tests. Load testing can help you ensure your Web application is available, but it isn't specifically about Web development: it tests how your hardware, network, and software behave under high load, how your database handles a large number of continuous connections, and how the database performs under that load. The parts of the Web application itself that load testing does measure are response times and errors.

The ideal result is one where your system (hardware/network/software) slowly increases in load and then plateaus for a while, before it gradually increases again. What you're looking to determine is whether the system can handle X number of requests without any additional work. X is whatever number you deem should be the expected load. The blunt truth is that your system will start to break down after a while: No system is infinitely scalable. When testing, your intent is to keep your system operating under normal circumstances for a reasonable cost. At a certain point, it no longer makes sense to keep investing money in infrastructure for only a handful of additional users. These decisions depend strictly on economics.

Now that you know what to look for, it's time to run a load test. Do this by selecting Test, New Test from the main menu. Then select Load Test, leave the name as loadtest1.loadtest, and add it to the WebTests project. This brings up the Load Test wizard, which allows you to configure think times, load pattern, browser type, and network speed, as well as designate which tests and counters to include. For this example, leave the scenario settings and load pattern settings exactly as they are. For the test mix, add the AddTripTest and AddExpenseItem Web tests.

You can also alter the distribution of these tests. It makes sense to add fewer trips than expense items; it's simply not logical to have more trips than expense items. Set the distribution to 20 percent for the trip test and 80 percent for the expense test. Leave the browser and network settings as they are: the browser setting is nothing more than the header sent with each request, and the network settings let you throttle the simulated connection from dial-up all the way up to a LAN. Leave the counter settings as-is and set the test duration (on the run settings page) to five minutes with no warm-up.

Choose a Validation Setting
The validation setting dictates to what extent you want VSTS to analyze your validations. Leaving it at Low tells VSTS to run only the validation rules marked Low; rules marked Medium or High are ignored. When you click on Finish, your load test setup is complete.

A couple more items are worth noting. First, VSTS includes an additional setting for Web tests in its test configuration file: you can set a fixed number of runs, or one run per data-source row. These tests are only partially data driven, so you can leave this at the default of a single fixed run. The second item worth noting is counters. By default, a threshold rule is created that raises an alert if a wait time exceeds 0.5 seconds. When you run this test in a development environment, you should delete that rule, because your development hardware probably isn't as powerful as your test or production hardware. You can find it in the tree under Counter Sets, LoadTest, Counter Categories, LoadTest:Request, Counters, Avg. Connection Wait Time, Threshold Rules. Select the Compare Counters node, view the properties, and delete the rule.

You're now ready to run the test. As the test runs, you can view errors, counters, or values at specific points (see Figure 5). The right pane shows the number of requests in each sampled interval (five-second intervals by default). These values vary because of think times. Think times are the amount of time a user spends thinking before navigating to another page or taking some other action. For example, consider a typical news Web site: no one clicks through every page as fast as possible; instead, visitors read an article that might be three pages long at an average of three minutes per page. The think time in this case is three minutes, and it accounts for the variation in requests between intervals. You can also change the behavior by eliminating think times completely. VSTS includes several other configuration options, and you can add additional counters to the graph at any time. Behind the scenes, this data is all stored as XML, so you can slice it any way you want.
