Web Apps’ Missing Link: Acceleration

Tech analysts call application acceleration the missing link in the fight against that ongoing bugbear, the slow-loading Web site, and they predict still more bottlenecks as enterprise networks prepare to distribute even more applications over the Internet.

Application acceleration? It’s the latest evolution of dynamic caching, which offloads information requests between servers to make way for more dynamic calls, all with the goal of speeding delivery back to the requesting browser. In short, accelerators keep applications from chewing through a network’s server capacity, yet they remain a niche sector.
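The idea is easier to see in miniature. The sketch below is illustrative only, not any vendor’s product: a hypothetical render_product_page() call stands in for an expensive trip to the application and database servers, and a simple in-memory cache answers repeat requests without touching the back end.

```python
# Illustrative sketch only: a tiny in-process cache for "dynamic" page output.
# render_product_page() is a hypothetical stand-in for a back-end call.
import time
from functools import lru_cache


def render_product_page(product_id: int) -> str:
    """Stand-in for an expensive dynamic call to application and database servers."""
    time.sleep(0.5)  # simulate back-end latency
    return f"<html><body>Product {product_id}</body></html>"


@lru_cache(maxsize=1024)
def cached_render(product_id: int) -> str:
    """Serve repeated requests for the same page from memory instead of the back end."""
    return render_product_page(product_id)


if __name__ == "__main__":
    start = time.perf_counter()
    cached_render(42)          # first request pays the back-end cost
    first = time.perf_counter() - start

    start = time.perf_counter()
    cached_render(42)          # second request is answered from the cache
    second = time.perf_counter() - start
    print(f"first: {first:.3f}s, cached: {second:.6f}s")
```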

But as applications are increasingly distributed over the Web and more complex forms of information are exchanged, amid the talk of Web Services and “on-demand” computing, application acceleration has become the new answer to a persistent problem: enterprise applications are bandwidth and CPU hogs.

“When you start delivering Web-based applications and put them on a browser, then you have a presentation layer issue,” explains Michael Hoch, an analyst with tech research firm Aberdeen Group.

“In a client-server environment, the client can do a lot of the presentation work and not affect a server at all. But in any Web-based application, you start having presentation issues. Because not only does another device need to get the data, but you have to decide what form it needs to be presented in as it’s being transferred across the network.”

Enter the application accelerators, whose software and hardware devices help ease the heavy loads that distributed applications place on multiple servers with each click of a mouse.

“The cache can be set between all points in a Web process, such as the database server, the application server and the Web server,” says Hoch. “Or (the caching software) might even run in front of a Web server so you can offload the requests that have to go back and forth between the different elements. The effect is the load on the server is less and the speed at which the information is served is much higher.”
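To make that tiered picture concrete, here is a minimal read-through cache sketch, again purely illustrative: the ReadThroughCache class and the query_database() stand-in are assumptions for the example, not part of any product Hoch describes. The cache sits in front of the back-end call, answers repeat requests from memory for a short time-to-live, and only goes back to the database on a miss or after an entry expires.

```python
# Minimal read-through cache sketch sitting "in front of" a back-end query,
# as in the tiered setup described above. All names here are illustrative.
import time
from typing import Any, Callable, Dict, Tuple


class ReadThroughCache:
    def __init__(self, fetch: Callable[[str], Any], ttl_seconds: float = 30.0):
        self._fetch = fetch                      # the expensive back-end call
        self._ttl = ttl_seconds
        self._store: Dict[str, Tuple[float, Any]] = {}

    def get(self, key: str) -> Any:
        now = time.monotonic()
        hit = self._store.get(key)
        if hit is not None and now - hit[0] < self._ttl:
            return hit[1]                        # served from cache: no back-end load
        value = self._fetch(key)                 # cache miss: go to the back end once
        self._store[key] = (now, value)
        return value


def query_database(sql: str) -> list:
    """Stand-in for a round trip to the database server."""
    time.sleep(0.2)
    return [("row", sql)]


if __name__ == "__main__":
    cache = ReadThroughCache(query_database, ttl_seconds=60)
    cache.get("SELECT * FROM orders WHERE region = 'EMEA'")  # miss: hits the database
    cache.get("SELECT * FROM orders WHERE region = 'EMEA'")  # hit: database untouched
```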

Hoch counts a half dozen or so companies that increasingly specialize in enterprise application acceleration.

One of them is New York-based WARP Technology Holdings. The company’s officials sum up their sector this way: enterprise applications are fundamentally CPU hogs. So what do enterprises do? They throw more big iron at the problem by installing more application servers.

Greg Parker, chief technology officer for WARP, who joined the company when it acquired dynamic caching firm Spider Software, says the purchase of new application servers then leads to more pricey application server licenses.

For example, he says a manufacturing company that deploys supply chain applications for 15,000 users will spend about $3.4 million for the application server licenses and $5 million for hardware.

“Eighty percent of the software and hardware costs are required because of the transactional nature of the systems, and the sheer number of licenses required to support the user base,” he says.

To distribute applications, say, between corporate offices in Tokyo, Houston and London, “you need to have multiple databases all in sync and have multiple application servers for load balancing. To do that with traditional database technology is extremely difficult, time consuming and costly,” not to mention the costs of infrastructure and administration.

WARP’s response to the problem is a new, and of course much cheaper, software/hardware product called the “2063 v2 pre-processor,” which Parker and Chief Executive Karl Douglas say can reduce server loads and increase throughput by 600 percent.

WARP 2063 is the lead component of the company’s GTEN (Global Transaction Enabled Network) architecture, its framework for distributed application acceleration. The company, which built its reputation providing dynamic caching for financial services firms (the bulk of New York’s IT industry), reckons that over the next two years, more than 55 percent of Fortune 1000 companies will start to move their key corporate applications onto the Web.

That means they’ll be demanding more of servers with applications such as customer relationship management, supply chain, financials, and OLAP (online analytical processing, a key component of data mining).

As companies grapple with the processing issues involved with distributing applications, WARP officials and other players in the niche space see a renewed opportunity for their products and services.

Hoch also expects major software and technology companies such as Oracle, IBM, and BEA to add more dynamic caching capabilities to their database products in years to come.

But for now, niche companies in the sector include Chutney Technologies, which offers object sharing and persistence software for enterprise applications, and Persistence Software, which makes mapping, caching and synchronization technologies to reduce inefficiencies in query-intensive online sites.

Spider Software, the dynamic caching firm WARP acquired, is another company covered by analysts who follow the sector. Companies with similar or ancillary caching products include Akamai, Network Appliance and Cisco. After all, these players have been around enterprise applications long enough to know how readily they chew through server capacity.

Peter Christy, an analyst with Internet-focused NetsEdge Research Group, is also keen on application accelerator software firm F5 Networks. “We like the way in which F5 continues to patiently bridge the gap between the application and the network — helping their customers get the most out of Internet network connectivity (which is a much bigger opportunity than just helping on Web site access),” he wrote recently on the firm’s Internet Acceleration Web log.

“The proof of this pudding, of course, is in the customers that cross that bridge, and what business solutions they bring along. At least, it looks like we can count on them being around long enough to get that answer.”

Still, despite the chilly environment for IT sales, Christy estimates that Web Data Center infrastructure products and services will represent about a $1 billion market by 2007.

And despite the continued decentralization of applications as they move to the Web browser, Aberdeen’s Hoch notes an interesting paradox at play in many enterprise networks: even as computers and devices dial into networks remotely, his research shows that IT buyers are also centralizing and consolidating their data centers.

“They want everything in a more centralized location (such as) two main data centers and to then deliver things over the Internet to the users.”

The bottom line with Web-distributed applications, analysts add, is that enterprise developers are still learning their way through the issues that are created when mainframe legacy systems meet new networking protocols.

“I expect that over time, this sort of technology will integrate into Web servers,” Hoch says. Even as major database companies such as IBM and Oracle move to provide these accelerators in coming years, Hoch and Christy expect the current crop of application accelerators to play a part in Web Services, such as processing SOAP messages among trading partners.
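As a rough illustration of the kind of SOAP traffic such accelerators would handle, the sketch below parses a purchase-order message exchanged between trading partners; the envelope contents and the urn:example:orders namespace are invented for the example.

```python
# Hedged illustration of SOAP processing between trading partners.
# The message body and the "urn:example:orders" namespace are made up.
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

MESSAGE = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <ord:PurchaseOrder xmlns:ord="urn:example:orders">
      <ord:PartNumber>872-AA</ord:PartNumber>
      <ord:Quantity>25</ord:Quantity>
    </ord:PurchaseOrder>
  </soap:Body>
</soap:Envelope>
"""


def extract_order(envelope: str) -> dict:
    """Pull the order fields out of the SOAP body."""
    root = ET.fromstring(envelope)
    body = root.find(f"{{{SOAP_ENV}}}Body")
    order = body.find("{urn:example:orders}PurchaseOrder")
    return {
        "part": order.findtext("{urn:example:orders}PartNumber"),
        "quantity": int(order.findtext("{urn:example:orders}Quantity")),
    }


if __name__ == "__main__":
    print(extract_order(MESSAGE))  # {'part': '872-AA', 'quantity': 25}
```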

“But for now,” adds Hoch, “application acceleration is a difficult thing to do. So for the near term, it bodes well for the niche players.”
