Stressed out from stress testing

Posted November 1, 1999
By Rich Levin


(Page 2 of 3)

The new requirements called for transforming Avis' legacy enterprise architecture from one that served a handful of internal agents to a publicly accessible fleet management system that thousands of Avis clients could access live, directly from their desktop PCs.

Lessons learned about automated testing technologies
  • Think like an ISV. Development in the e-business age requires IT organizations to conceptualize systems as commercial apps. To maximize application quality, adopt the proven best practices long used by successful ISVs, such as heavy use of automated testing tools, technologies, and development methodologies.

  • Test early and often. Integrate your QA testing team into the development process from day one. The more the testers know about the application requirements, coding, and resulting app, the better they'll be able to devise test scenarios and regression scripts.

  • Prototype the user interface first. Send the resulting user interface (UI) demo out to customers, partners, and other users for early feedback. By defining the UI early, the IT team reduces the risk of show-stopping usability bugs surfacing late in the development lifecycle.

  • Don't overlook load testing. Applications that perform well under low-to-moderate loads can behave unpredictably when stress scales up. Use an automated load-testing tool to stress application architectures at two or three times the expected user load (a minimal sketch of the idea follows this list).

  • Test the application's technology as well as its logic. Your application code might be robust, but a single bug in your vendor's app server or OS can bring your organization to its knees.

  • Build a multidisciplinary QA team. Software quality assurance today involves far more than GUI regression and load testing. Web apps are inherently complex, leveraging multiple new technologies and heterogeneous legacy systems. QA teams need to test integration points, database integrity, object messages and communications, and more.

  • Evaluate all the major test tool product lines. Most vendors have completely updated their offerings to address the requirements of e-business development efforts.

  • Have contingency plans. Be sure your IT team is prepared for a major system failure, and have a disaster plan in place to bring systems back up as quickly as possible. Test the contingency plans often with drills.
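
The load-testing advice above is tool-agnostic. The fragment below is a minimal, hypothetical sketch in Python of the underlying idea: many simulated users hammering a single URL while failures and elapsed time are tallied. The endpoint, user count, and request count are placeholders rather than details from the Avis project, and a commercial tool such as LoadRunner adds much more, including scripting, ramp-up control, and server monitoring.

import time
import urllib.error
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://localhost:8080/fleet/report"  # hypothetical endpoint
VIRTUAL_USERS = 50        # set this two to three times beyond the expected real load
REQUESTS_PER_USER = 20

def virtual_user(_user_id):
    """One simulated user: issue a series of requests, count successes and failures."""
    ok = failed = 0
    for _ in range(REQUESTS_PER_USER):
        try:
            with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
                resp.read()
            ok += 1
        except (urllib.error.URLError, OSError):
            failed += 1
    return ok, failed

if __name__ == "__main__":
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
        results = list(pool.map(virtual_user, range(VIRTUAL_USERS)))
    elapsed = time.perf_counter() - start
    total_ok = sum(ok for ok, _ in results)
    total_failed = sum(failed for _, failed in results)
    print(f"{total_ok} requests ok, {total_failed} failed in {elapsed:.1f}s")

Even a crude harness like this, pointed at a staging server, will surface the kind of under-load failures that never appear when a handful of QA testers exercise an app by hand.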

    Ambitious from the start, Avis aimed to do more than just surface marketing and reporting schemes on the Web. The new architecture would touch literally every customer, every driver, and every one of the 350,000 vehicles in the company's corporate rental fleet.

    The first application targeted was a huge vehicle fleet maintenance solution, where clients could go online and manage vehicle histories, repair costs and operational expenses, driver profiles, and safety training reports, as well as analyze accident records.

    It was a high-velocity U-turn for Avis. The company historically relied on its customer call centers to support client queries by telephone, with monthly reports generated by computer and delivered to fleet managers by snail mail.

    "When we started this effort, we were a principal vendor, but not a principal app on the fleet manager's desktop," Lutz says. "Today when they manage their fleet, we are the principle application they use. We're no different from Microsoft [Corp.] or Sun [Microsystems Inc.] in that regard. We have become a mission-critical software provider."

    Critical mission, critical testing

    Lutz's team decided the shift from mission-critical vendor to mission-critical ISV called for mission-critical testing. It proved a fortunate call: had the company not applied automated stress-testing technologies, the system would have collapsed on its first day online.

    "The application just wouldn't work under load," Lutz recounts. "It ran great with a small team of QA testers exercising it, but when we applied the load-testing software, it wouldn't work."

    Using the LoadRunner load-testing software from Mercury Interactive Corp. of Sunnyvale, Calif., Lutz's team was able to simulate 10,000 concurrent users banging away on the app. The problem was traced to a bug in the ColdFusion app server from Allaire Corp. of Cambridge, Mass.
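
    One way to localize that kind of failure is a stepped ramp: raise the number of concurrent simulated users in stages and watch where errors or response times begin to climb. The hypothetical sketch below, in plain Python rather than any commercial tool's scripting language, illustrates the pattern; the endpoint and stage sizes are placeholders, not figures from the Avis project.

import time
import urllib.error
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://localhost:8080/fleet/report"  # hypothetical endpoint
STAGES = [10, 50, 100, 250, 500]                   # concurrent simulated users per stage

def one_request(_):
    """Return response time in seconds, or None if the request failed."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
            resp.read()
        return time.perf_counter() - start
    except (urllib.error.URLError, OSError):
        return None

if __name__ == "__main__":
    for users in STAGES:
        with ThreadPoolExecutor(max_workers=users) as pool:
            samples = list(pool.map(one_request, range(users)))
        failures = sum(1 for s in samples if s is None)
        good = [s for s in samples if s is not None]
        avg = sum(good) / len(good) if good else float("nan")
        print(f"{users:4d} users: {failures} failures, avg response {avg:.3f}s")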

    After Allaire issued a patch, the system was again subjected to a round of load testing. This time, it passed. As development continued, Lutz's team brought more customer data online, eventually exposing Avis' entire 600GB data warehouse to the Web.

    Each step of the way, the team repeatedly threw app modules into a load-testing pressure cooker. Lutz says that, at the time, without Mercury's LoadRunner product, this extreme level of load testing would have been beyond the realm of possibility.

    That's because most load-testing products required pools of PCs. The server-based LoadRunner required only one central server.

    "Typically with client/server or Web stress testing, you have to drive it from multiple PCs," Lutz explains. "I would have had to commandeer entire buildings of PCs. It would have been impossible to test the kinds of numbers we're talking about."

    In terms of predictability, the testing results have been "right on," Lutz says. The Avis site zoomed to an average of 90,000 hits per day soon after it was deployed, with a one-day peak of 141,000.

    "We've had no problems due to load," Lutz says. As PHH gets set to scale the site again with a new development phase to open the app to more online customers, the group has purchased an additional LoadRunner license to enable scalability testing beyond 30,000 simulated users.

    "No architecture can scale infinitely," says Lutz. "Every time you add or change something, you have to test. We want to take it up further, and stay ahead of the curve as far as numbers of users. Load testing lets us do that."

    On the razor's edge

    Spurred on by extreme market pressure, companies are forced to stay ahead of the curve. They must relentlessly update, upgrade, rev, improve, add features, embrace trends, and innovate technologically, all within the constraints of the software development process.

    The hallmark of any successful ISV that needs to ship shrink-wrapped product to a fickle enterprise marketplace is the aggressive release mentality one would expect to find at Microsoft, Sun, or Red Hat Inc.



