The pain of platform possibilities

Just because the software your company develops runs on many platforms doesn't mean that every possible configuration can or should be tested.


While component-based architectures allow software developers to create applications that support many different databases, servers, and operating environments, they create a quality quagmire of nightmarish proportions for software testers.

"You mean to tell me you aren't even going to test the server platform that is used by the customer who signed our largest deal last quarter?!" he bellowed."
The reason? It may take the same effort to develop an application for any ODBC-compliant database as it does for just one, but it takes a geometric multiple of that effort to test it because each and every database--in each and every potential platform configuration--must be tested. Different databases may have different reserved keywords, different sub- or supersets of ODBC support, or different constraints in different environments. Thus, each and every combination of all the elements must be tested together in order to truly assure quality.
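
To make the point concrete, here is a minimal sketch--mine, not the column's--of the kind of per-database divergence involved. The reserved-word lists below are abbreviated and illustrative, not authoritative:

    # Illustrative only: abbreviated reserved-word lists for a few engines.
    # The same identifier can be legal on one database and rejected on
    # another, which is why each target must be tested in its own right.
    RESERVED_WORDS = {
        "oracle":     {"USER", "LEVEL", "ACCESS"},
        "postgresql": {"USER", "CHECK", "OFFSET"},
        "sqlite":     set(),   # far more permissive about identifiers
    }

    def identifier_is_safe(name, database):
        """True if the identifier can be used unquoted on this database."""
        return name.upper() not in RESERVED_WORDS[database]

    for db in RESERVED_WORDS:
        print(db, identifier_is_safe("user", db))
    # A table named "user" passes on SQLite but must be quoted or renamed
    # on Oracle and PostgreSQL--a difference that only testing on each
    # database will reveal.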

Do the math. If your application supports four different databases on six different hardware platforms under three different operating systems, you are looking at testing the same application 72 times! Throw in other variations, like middleware or network protocols, and you are in the stratosphere for test time.
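
To spell out that multiplication, a few lines of Python (mine, not the author's; the platform names are placeholders) enumerate the matrix:

    from itertools import product

    # Placeholder names for the example's dimensions: four databases,
    # six hardware platforms, three operating systems.
    databases = ["db_a", "db_b", "db_c", "db_d"]
    hardware  = ["hw_1", "hw_2", "hw_3", "hw_4", "hw_5", "hw_6"]
    systems   = ["os_x", "os_y", "os_z"]

    configurations = list(product(databases, hardware, systems))
    print(len(configurations))   # 4 * 6 * 3 = 72 full test passes

    # Add one more dimension--say, three network protocols--and the
    # matrix triples to 216 passes, before any middleware variations.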

Under such circumstances, any competent, thorough software tester who takes pride in shipping a quality product is doomed to be frustrated no matter how hard he tries. Not only is it impossible to test every single configuration with the time and resources available, but I have worked with several companies where the test group doesn't even have access to all of the supposedly supported platforms. As a result, customers uncover critical issues in the field, which is the most expensive place to fix them.

The odds are against reining in marketing or sales by limiting the platforms, since that's where the money is. What to do?

Define your terms

To defend its borders, the test group must define them. This means clearly stating, accepting, and communicating to all concerned--both inside the company and to customers--which configurations are, in fact, tested and which are not. This frees the test group from spending all of its time explaining why, out of the dozens of configurations it did test, it did not test the exact one the customer is screaming about.

I recommend organizing around the concept of "certified" versus "supported" configurations. A "certified" configuration is one that is actually tested, while a "supported" configuration is one the company agrees to accept responsibility for resolving if it fails. This distinction is important for three key reasons: It defines the platforms within the scope of the test effort; it identifies the potential risk of those out of the scope; and it enables a mitigation strategy for those risks.
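
One way to make the distinction operational--this is a sketch of mine, with hypothetical entries, not something prescribed here--is to record every configuration's status in a single list, so any inquiry can be answered from it directly:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Configuration:
        database: str
        hardware: str
        operating_system: str

    # Hypothetical entries; the real lists come from the test plan.
    CERTIFIED = {
        Configuration("Oracle", "x86 server", "Linux"),
        Configuration("SQL Server", "x86 server", "Windows"),
    }
    SUPPORTED = CERTIFIED | {
        Configuration("Oracle", "SPARC", "Solaris"),   # accepted, never tested
    }

    def status(config):
        if config in CERTIFIED:
            return "certified: exercised in every test cycle"
        if config in SUPPORTED:
            return "supported: failures will be fixed, but it ships untested"
        return "unsupported: outside the agreed scope"

    print(status(Configuration("Oracle", "SPARC", "Solaris")))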

Certified configurations

The beauty of precisely defining which configurations the test group will actually test, or certify, is it reveals to the rest of the organization the cold realities of what testing is up against. For example, I was reviewing the certified configuration list with the sales VP for a financial services software company when he was shocked to discover that the test group was not going to test the server platform of a customer. "You mean to tell me you aren't even going to test the server platform that is used by the customer who signed our largest deal last quarter?" he bellowed. "Why the ---- not?"

I smiled. "Because our purchase request for a test server was denied." He looked astounded, then promised to get us access to one, somehow.

"Great," I said. I was now on a roll: "But there's one more thing. I either need two more people, or two more weeks in every test cycle to cover this additional platform."


