Stressed out from stress testing: Page 3

Posted October 26, 1999
By Rich Levin


(Page 3 of 5)


Automating quality

Born over 100 years ago as Rochester Telephone, a small local telco, the company started buying up small Baby Bells after the AT&T breakup. A long string of acquisitions later, Frontier emerged as the country's #5 domestic long-distance carrier.

Under the leadership of CEO Joe Clayton, Frontier's IT organization has, for the past two years, been charged with aggressively moving the firm's entire computing infrastructure to the Web. The initiative is dubbed TMN, for Telco Management Network.

The e-business programming team at Frontier is turning to automated testing technologies to maintain the highest service levels of the firm's customer-facing apps. The reason: With virtually all of the company's systems headed for the Web, a failure in any one of them translates directly into lost revenue.

"There was a time when application downtime didn't directly impact revenue," Flow says. "Today the revenue for our company is based on these Web-based apps. If any aspect is down or not doing its job properly, I'm losing revenue."

The laundry list of Web apps under development at Frontier runs the gamut from order-entry systems to inventory control to customer care and billing, and everything in between. To cope with the varied testing requirements demanded by these increasingly interdependent Web systems, Flow says he's had to redefine the role of QA engineer.

"My QA engineers have a dual role," Flow explains. "They play unit testing, and they play integration QA." Flow says he integrates QA engineers directly into the development process from day one. This is so they can understand the user requirements and application specs, and ensure the code delivers.

Once the development team declares a unit "code complete," Flow has his QA testers switch gears. "When [the developers] say something's complete, my QA engineer has to change his hat from a unit tester to an integration tester. That's where the automated tools come in."
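The two hats Flow describes can be illustrated with a small sketch. The `InventoryService` and `OrderEntry` classes below are invented for this example, not Frontier's code: a unit test exercises one component in isolation, while an integration test drives the same component through a dependent application.

```python
# Hypothetical illustration of the unit-test vs. integration-test roles.
# InventoryService and OrderEntry are invented for this sketch; they are
# not Frontier's actual systems.

class InventoryService:
    def __init__(self):
        self._stock = {"T1-LINE": 10}

    def reserve(self, sku, qty):
        if self._stock.get(sku, 0) < qty:
            raise ValueError("insufficient stock")
        self._stock[sku] -= qty
        return self._stock[sku]

class OrderEntry:
    """A dependent app that hits the inventory system."""
    def __init__(self, inventory):
        self.inventory = inventory

    def place_order(self, sku, qty):
        remaining = self.inventory.reserve(sku, qty)
        return {"sku": sku, "qty": qty, "remaining": remaining}

# Unit test: one component, in isolation.
def test_reserve_decrements_stock():
    inv = InventoryService()
    assert inv.reserve("T1-LINE", 3) == 7

# Integration test: the same component driven through a dependent app.
def test_order_entry_hits_inventory():
    inv = InventoryService()
    order = OrderEntry(inv).place_order("T1-LINE", 2)
    assert order["remaining"] == 8

test_reserve_decrements_stock()
test_order_entry_hits_inventory()
```

The unit test validates the component against its spec; the integration test validates the handoff between systems, which is where the automated tooling earns its keep.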

Multiple points of failure

For example, Frontier's Inventory Management System (IMS) is the needle's eye through which five other enterprise Web systems are threaded. Virtually every conceivable interaction between these business-critical, codependent application systems must be rigorously tested.

IMS manages the company's total inventory and, as such, is depended upon by virtually all the other major application systems. Five different Web-based apps rely on it, including the company's product configurator, order-entry system, work-flow engines, and billing system.

"We used to just test the GUI," Flow says. "Does the app ask the right questions, do the forms work, is the data saved, and so on. But now we have to make sure every app hits IMS, and that the data interacts properly with other apps in the dependency chain. This is where integration testing takes over, and why we had to find an automated tool."

Because integration testing at this scale was beyond manual means, Flow's group turned to automation; specifically, Compuware's QA Director. The product allows multiple applications and databases to be scripted and executed simultaneously--a key feature for integration testing.

Flow's team uses QA Director to hit all the applications and their databases and to generate reports that flag errors in system interactions. "Application A touches application B, and B hits C and D, while E might hit A," Flow says. "QA Director can actually manage this kind of elaborate test."
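QA Director's actual scripting interface isn't shown here, but the kind of cross-application check Flow describes can be sketched generically. The app names, the `touches` dependency map, and the payload below are invented for illustration:

```python
# Generic sketch of a dependency-chain integration check -- not QA
# Director's real scripting API. A test payload is propagated along
# every edge of the dependency graph, and the harness then flags any
# edge the payload failed to traverse.

touches = {"A": ["B"], "B": ["C", "D"], "E": ["A"]}  # E->A, A->B, B->C, B->D

received = {app: [] for app in "ABCDE"}

def send(src, dst, payload):
    """Simulate one app hitting another; a real test would call the apps."""
    received[dst].append((src, payload))

def run_chain(start, payload, seen=None):
    """Propagate a test payload along every edge reachable from start."""
    seen = seen if seen is not None else set()
    for dst in touches.get(start, []):
        if (start, dst) not in seen:
            seen.add((start, dst))
            send(start, dst, payload)
            run_chain(dst, payload, seen)

run_chain("E", "test-record-42")

# Report any edge in the dependency map the payload never reached.
failed = [(src, dst) for src, dsts in touches.items() for dst in dsts
          if (src, "test-record-42") not in received[dst]]
assert not failed, f"broken interactions: {failed}"
```

The value of the tool is in the bookkeeping: once the dependency map is scripted, every interaction gets exercised on every run, rather than whichever paths a tester remembers to try.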

Now, whenever Frontier prepares to issue a new release, the release is first run through a full integration regression test suite, managed by scripts running under QA Director. This ensures previously tested functionality remains unchanged and validates the new features.
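A regression pass like the one described boils down to replaying a stored suite against the new release and comparing results with a captured baseline. A minimal sketch, with invented release functions and scenarios (this is not QA Director itself):

```python
# Minimal regression-harness sketch: replay baseline inputs against a
# new release and flag any changed behavior. The releases and the
# order-quantity scenarios are invented for this example.

def release_v1(order_qty):          # previously validated behavior
    return {"status": "accepted" if order_qty > 0 else "rejected"}

def release_v2(order_qty):          # new release under test
    # New feature: hold very large orders; old behavior otherwise.
    if order_qty > 1000:
        return {"status": "held-for-review"}
    return release_v1(order_qty)

# Baseline captured from the already-tested release.
baseline = {qty: release_v1(qty) for qty in (0, 1, 500)}

# Regression check: old inputs must produce unchanged outputs...
regressions = {qty: release_v2(qty) for qty in baseline
               if release_v2(qty) != baseline[qty]}
assert not regressions, f"regressions found: {regressions}"

# ...while new-feature inputs are validated separately.
assert release_v2(5000) == {"status": "held-for-review"}
```

The two assertions mirror the article's two goals: previously tested functionality is unchanged, and the new features behave as specified.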






