Photo Credit: Pulpolux !!! @ Flickr

Despite being considered a small player in the insurance field, Mike I.'s company writes $1.1 billion in premiums annually and has carved out a nice niche in non-standard automobile insurance. Non-standard coverage is for drivers who are turned away by mainstream insurers for things like too many speeding tickets, fender benders, or DUIs. Like every other insurance company, Mike's relies on complex custom software to quote and write its policies.

Complex is a bit of an understatement. Because the company does business in more than 40 states, it has to comply with each state's specific insurance regulations: certain types of coverage aren't allowed in some states, deductibles vary everywhere, and the limits seem to change at random. One of the biggest challenges with software this complex is that testing it becomes very difficult.

For years, the testing strategy was entirely manual. Testers would enter thousands of test cases -- each taking a good five minutes or so. After a while, management multiplied that by a bimonthly release and realized that there were a lot of hours tied up in testing that could be automated with data-entry scripts. In an effort to save more money, management outsourced the automation project to an offshore team.

After spending nearly a year in development, the offshore team delivered the Policy Testing Suite (PTS). It was a stand-alone Windows app that tied into the policy systems by binding to fields using the Windows API and provided testers with a state-specific library of macros that would fill out the various forms and submit them. Behind the scenes, the macros were simply a collection of VBScript files that were to be created and maintained by Mike's team.

It didn't take too long for Mike to notice a fundamental flaw in the PTS design -- there was no way to share scripts between states. Although each state had its unique set of requirements, there was a lot of common functionality such as the insured's name, address, phone number, etc. To work around this, testers simply copied scripts between states and changed them as needed, and any change to the common areas had to be applied to all 40-plus versions of the script. Though testing time was reduced, programming time quickly skyrocketed.

A Better Solution

Well aware of the maintenance problems with the thousands of various VBScript files, management had the offshore team completely rewrite PTS. The new version, the offshore developers promised, would not only ease maintenance, but would be data-driven in an extensible manner so that business analysts could develop the macros instead of programmers.

At the core of PTS version 2 (PTSv2) was the concept of policy actions, which were merely a collection of VBScript subroutines that were defined like this:

Sub SetPrimaryInsuredFirstName( target, value )

This allowed business analysts to define test cases as a series of named actions (SetPrimaryInsuredFirstName), each with a target (Policy) and a value (John). To make test-case development even easier, PTSv2 used Microsoft Excel spreadsheets as its data store. To add a new test case, all one had to do was edit a spreadsheet in a shared folder and, voilà, it would appear on testers' consoles.
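
As a rough illustration of how that action/target/value dispatch might have been wired up -- the subroutine body, sheet layout, and RunTestCase helper below are assumptions for the sake of the sketch, not the actual PTS code:

Sub SetPrimaryInsuredFirstName( target, value )
    ' In the real system this would locate the first-name field on the
    ' target policy form and type the value into it; here it just logs.
    WScript.Echo "Setting first name on " & target & " to " & value
End Sub

Sub RunTestCase( workbookPath )
    Dim xl, wb, sheet, row, action, target, value
    Set xl = CreateObject("Excel.Application")
    Set wb = xl.Workbooks.Open(workbookPath, , True)   ' open read-only
    Set sheet = wb.Worksheets(1)

    row = 2   ' assume row 1 holds the headers: Action, Target, Value
    Do While sheet.Cells(row, 1).Value <> ""
        action = sheet.Cells(row, 1).Value
        target = sheet.Cells(row, 2).Value
        value  = sheet.Cells(row, 3).Value
        ' Dispatch to the subroutine named in the Action column
        Execute action & " """ & target & """, """ & value & """"
        row = row + 1
    Loop

    wb.Close False
    xl.Quit
End Sub

The Execute call is just one way to map the Action column onto a subroutine name; GetRef or a Select Case dispatcher would do the same job.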

It didn't take long for Mike to notice a fundamental flaw in the new design: there was still no way to share test cases between states. However, because automated test-case maintenance had fallen on the shoulders of the business analysts, he simply let them worry about the redundancies. Besides, his team had its own issues: because of the data-driven nature of PTSv2, debugging problematic test cases became nearly impossible.

The Slowdown

As the months passed and more and more test cases were migrated to PTSv2, testers noticed that data entry was slowing down. In the old system, clicking a testing macro resulted in the forms being filled out instantaneously. In the new system, however, they could actually see each field being filled out.

After several more months of effort, all of the test cases were moved to PTSv2. But by that time, testing had ground to a halt. Clicking a macro resulted in a flurry of disk activity and a several-second wait between each field. On a big testing day, it would take nearly 20 minutes for PTSv2 to enter a single test case.

Realizing that testers could type in the data faster than their computers could, management declared automated testing a failed experiment and required that all testing be done manually once again. But Mike wasn't convinced, and he investigated why the scripts ran so slowly.

What he found in the code made him want to cry. The test machines had dual-core processors and 2GB of RAM, and each Excel spreadsheet was only about 1MB. Yet whenever PTSv2 needed to find a single cell value, it would open the file over the network, parse every single line, and then extract that one value from the row. Because this happened thousands of times per test case, it was no wonder the system was so slow.
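
Based on that description, the lookup logic presumably looked something like the sketch below -- the function name and sheet layout are guesses; the open-scan-close-per-value pattern is the point:

Function LookupValue( workbookPath, caseId, columnIndex )
    Dim xl, wb, sheet, row
    ' Every single lookup opened the workbook over the network...
    Set xl = CreateObject("Excel.Application")
    Set wb = xl.Workbooks.Open(workbookPath, , True)
    Set sheet = wb.Worksheets(1)

    ' ...then walked the rows from the top until it found the one it wanted...
    row = 2
    Do While sheet.Cells(row, 1).Value <> ""
        If sheet.Cells(row, 1).Value = caseId Then
            LookupValue = sheet.Cells(row, columnIndex).Value
            Exit Do
        End If
        row = row + 1
    Loop

    ' ...and finally closed everything again, only to repeat the whole
    ' dance for the very next field.
    wb.Close False
    xl.Quit
End Function

Opening each workbook once per run and caching its rows in, say, a Scripting.Dictionary would have turned thousands of network round-trips into one.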

Fortunately, after a bit of convincing, management agreed to redevelop the PTS and, after a couple of months with a new team, PTS is once again entering data faster than the testers can.


Slow-Motion Automation was originally published in Alex's DevDisasters column in the Sep 1, 2008 issue of Redmond Developer News. RDN is a free magazine for influential readers and provides insight into Microsoft's plans, and news on the latest happenings and products in the Windows development marketplace.
