Stressed out from stress testing

Posted November 1, 1999
By Rich Levin


(Page 3 of 3)

It's not the kind of thinking normally associated with IT organizations, which historically deal with internally driven business requirements that take months to refine, development cycles that are measured in years, and application systems that are built to span decades.

Examine the cycle time of any e-business shop, though, and you'll find development lifecycles pegged to monthly, weekly, daily, even hourly release builds.

"IT used to operate under the notion that you write perfect requirements, and everything flows from there," says Sam Guckenheimer, the Lexington, Mass.-based senior director of automated testing products for Rational Software Corp., in Cupertino, Calif. "But the Web is iterative. The requirements change daily. We've seen some dot-com organizations with six-hour release cycles."


William Flow, manager of software quality assurance for Frontier Corp.: "The revenue for our company is based on these Web-based apps."

E-business developers agree, and add that compressed application lifecycles aren't the only challenge in maintaining high-quality customer-facing e-apps. The architecture of Web-based systems is inherently complex, heterogeneous, and fragile.

"We might do five releases in three months," says William Flow, manager of software quality assurance for Frontier Corp., in Rochester, N.Y. "And it's not just the Web client we're revving. We have to test all the stuff--integration, databases, other Web apps we hit, e-mail systems--it can be hundreds of things. It's become impossible to do it manually."

Frontier typifies the diverse mix of platforms many enterprise organizations struggle to integrate as they fight their way to the Web. The company's architecture is a blend of mainframes and Solaris and Linux IP servers running on Intel, SPARC, and UltraSPARC machines.

Automating quality

Born over 100 years ago as Rochester Telephone, a small local telco, the company started buying up small Baby Bells after the AT&T breakup. A long string of acquisitions later, Frontier emerged as the country's #5 domestic long-distance carrier.

For the past two years, Frontier's IT organization, under the leadership of CEO Joe Clayton, has been charged with aggressively moving the firm's entire computing infrastructure to the Web. The initiative is dubbed TMN, for Telco Management Network.

The e-business programming team at Frontier is turning to automated testing technologies to maintain the highest service levels for the firm's customer-facing apps. The reason: With virtually all of the company's systems headed for the Web, a failure in any one of them translates directly into lost revenue.

"There was a time when application downtime didn't directly impact revenue," Flow says. "Today the revenue for our company is based on these Web-based apps. If any aspect is down or not doing its job properly, I'm losing revenue."


The laundry list of Web apps under development at Frontier runs the gamut from order-entry systems to inventory control to customer care and billing, and everything in between. To cope with the varied testing requirements demanded by these increasingly interdependent Web systems, Flow says he's had to redefine the role of the QA engineer.

"My QA engineers have a dual role," Flow explains. "They play unit testing, and they play integration QA." Flow says he integrates QA engineers directly into the development process from day one. This way they can understand the user requirements and application specs, and ensure the code delivers.

Once the development team declares a unit "code complete," Flow has his QA testers switch gears. "When [the developers] say something's complete, my QA engineer has to change his hat from a unit tester to an integration tester. That's where the automated tools come in."
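
The split Flow describes follows a familiar pattern: a unit test exercises one piece of code in isolation, while an integration test drives one system and verifies the effect in another. The Python sketch below is a hypothetical illustration of that division, not Frontier's test code; the inventory and order-entry stand-ins are invented for the example.

import unittest


def calculate_line_total(quantity, unit_cents):
    """Stand-in for a billing routine; a unit test needs nothing else."""
    return quantity * unit_cents


class FakeInventory:
    """In-memory stand-in for an inventory service (hypothetical)."""
    def __init__(self, stock):
        self.stock = dict(stock)

    def available(self, item):
        return self.stock[item]

    def reserve(self, item, quantity):
        self.stock[item] -= quantity


class FakeOrderEntry:
    """In-memory stand-in for an order-entry system that hits inventory."""
    def __init__(self, inventory):
        self.inventory = inventory

    def place_order(self, item, quantity):
        self.inventory.reserve(item, quantity)


class UnitTestExample(unittest.TestCase):
    """Unit test: one function, exercised in isolation."""
    def test_line_total(self):
        self.assertEqual(calculate_line_total(quantity=3, unit_cents=1995), 5985)


class IntegrationTestExample(unittest.TestCase):
    """Integration test: drive one system, verify the effect in another."""
    def test_order_reserves_inventory(self):
        inventory = FakeInventory({"T1-CIRCUIT": 10})
        orders = FakeOrderEntry(inventory)
        orders.place_order("T1-CIRCUIT", 1)
        self.assertEqual(inventory.available("T1-CIRCUIT"), 9)


if __name__ == "__main__":
    unittest.main()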

Multiple points of failure

For example, Frontier's Inventory Management System (IMS) is the needle's eye through which five other enterprise Web systems are threaded. Virtually every conceivable interaction between these business-critical, codependent application systems must be rigorously tested.

IMS manages the company's total inventory and, as such, virtually every other major application system depends on it. Five different Web-based apps rely on it, including the company's product configurator, order-entry system, workflow engines, and billing system.

"We used to just test the GUI," Flow says. "Does the app ask the right questions, do the forms work, is the data saved, and so on. But now we have to make sure every app hits IMS, and that the data interacts properly with other apps in the dependency chain. This is where integration testing takes over, and why we had to find an automated tool."

Because the complexity of the integration testing was beyond human means, Flow's group turned to automation--specifically, Compuware's QA Director. The product allows multiple applications and databases to be scripted and executed simultaneously--a key feature for integration testing.

Flow's team uses QA Director to hit all the applications and their databases and to generate reports that flag errors in system interactions. "Application A touches application B, and B hits C and D, while E might hit A," Flow says. "QA Director can actually manage this kind of elaborate test."
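
The applications Flow describes form a dependency graph, and a harness has to exercise each system in an order that respects who calls whom. The Python sketch below orders hypothetical checks topologically over that chain; it illustrates the idea only and is not how QA Director is scripted.

from graphlib import TopologicalSorter

# Each application maps to the systems it calls (names invented for
# illustration): A touches B, B hits C and D, and E hits A.
calls = {
    "A": {"B"},
    "B": {"C", "D"},
    "E": {"A"},
    "C": set(),
    "D": set(),
}

def check_system(name):
    """Placeholder for a real scripted check against one application."""
    print(f"running scripted checks against {name}")

# TopologicalSorter treats the mapped values as predecessors, so this
# ordering checks C and D before B, B before A, and A before E.
for app in TopologicalSorter(calls).static_order():
    check_system(app)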

Now, whenever Frontier prepares to issue a new release, the build is first run through a full integration regression test suite managed by scripts running under QA Director. This confirms that previously tested functionality still behaves as before and validates the accuracy of new features.
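
Stripped to its essentials, a regression suite re-runs the same scripted checks on every build and flags any result that differs from a known-good baseline. A toy Python illustration of that comparison, with check names and expected values invented for the example:

def run_checks():
    """Stand-ins for scripted checks run against the release candidate."""
    return {
        "order_entry_saves_record": True,
        "configurator_prices_bundle": 4995,
        "ims_reserves_circuit": True,
    }

# Results captured from the last known-good release.
BASELINE = {
    "order_entry_saves_record": True,
    "configurator_prices_bundle": 4995,
    "ims_reserves_circuit": True,
}

def regressions(current, baseline):
    """Name every check whose result no longer matches the baseline."""
    return [name for name, expected in baseline.items()
            if current.get(name) != expected]

if __name__ == "__main__":
    failed = regressions(run_checks(), BASELINE)
    print("regressions:", failed or "none")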

Risk avoidance

Flow says that, without the availability of Web-savvy integration testing tools, Frontier's entire application portfolio would be at risk. "If we didn't have these automated tools, we simply couldn't do the testing," he says. "We'd be in a world of hurt right now."

Certainly automated testing tools can ease the pain of integration testing and help ensure a site's ability to withstand heavy user loads, the likes of which no legacy IT app has ever been asked to sustain. But not one of the automated testing tools available today can replace the need to beta test, using qualified users culled from the application's target audience. It's the only engineering process known that can isolate bad user interfaces.
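
The load-testing half of that job is conceptually simple: simulate many concurrent users hitting the site and measure response times and errors. The bare-bones Python sketch below shows the core loop against a placeholder URL; commercial tools layer scripting, ramp-up profiles, and reporting on top of the same idea.

import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TARGET = "http://example.com/"   # placeholder, not a real test target
USERS = 25                       # concurrent simulated users
REQUESTS_PER_USER = 10

def simulated_user(user_id):
    """One simulated user issuing a series of requests."""
    timings, errors = [], 0
    for _ in range(REQUESTS_PER_USER):
        start = time.time()
        try:
            with urlopen(TARGET, timeout=10) as resp:
                resp.read()
        except OSError:            # connection failures, timeouts, HTTP errors
            errors += 1
        timings.append(time.time() - start)
    return timings, errors

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=USERS) as pool:
        results = list(pool.map(simulated_user, range(USERS)))
    all_times = [t for timings, _ in results for t in timings]
    total_errors = sum(errors for _, errors in results)
    print(f"requests: {len(all_times)}  errors: {total_errors}  "
          f"avg response: {sum(all_times) / len(all_times):.3f}s")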

"As a [testing tool] vendor, I hate to say that our tools can't perform a certain function, but the truth is, usability testing is the one thing no automated tool can do," says Diane Hagglund, senior manager for e-business product marketing at Mercury Interactive.

Hagglund says usability testing might never be automated, because it has to do with responding to human emotions--something that has yet to be computerized. "We're seeing more and more traditional IT shops doing what ISVs would call beta testing, under the guise of usability testing," she says.

That's exactly what's happening at Acentris Wireless Communications, a telco services reseller in Seattle. There the beta test process has been integrated into the overall development lifecycle, with a core group of developers, internal users, and customers comprising Acentris' beta test team.

The company recently migrated from its legacy Microsoft Visual Basic 4 (VB4) client/server system to a fully distributed platform. The new system is built in VB6 and leverages several beta technologies itself, including a COM+ framework and Windows 2000 Beta 3 RC1 servers.

"We prototyped the Web UI first, and sent it out to a small group of customers and internal users for beta testing," says Acentris VP Darren Lang. "That gave us a huge head start, because we were able to fine-tune the user experience and hand the UI off to the programmers early in the development process."

Acentris' development team was then free to focus on the migration's nuts and bolts, and use automated tools to stress and regression test the application architecture, knowing usability was already in hand.

"The reputation of the IT department no longer rides on how well they manage the printers, back up the servers, or get a new PC on your desk," says Michael Marquardt, president of Internet Operations Center Inc., an e-commerce application hosting company in Southfield, Mich. "It's now the software development arm of the business, and that means we need to think and act more like ISVs, and less like islands of technology." //

Rich Levin covers IT for CBS Radio and the Coast to Coast Radio Network. He can be reached at RBLevin@RBLevin.net.


