Editor’s note: This article is part of an IT Management Special Report on infrastructure and outsourcing.
Managed services and hosted applications are giving way to an IT world in which computing power is available on demand from a public, utility-like grid.
It may seem like exotic, bleeding-edge technology, but grid computing is poised to replace today’s world of managed services and outsourced applications.
That’s the view of industry observers like IDC analyst David Tapper, who believes that computing power will eventually be available on a public, utility-like grid, much as electricity and water are today.
Don’t hold your breath waiting for that to happen, however. While Tapper and other experts, such as Eric Broda, vice president of services at grid software vendor Platform Computing, believe the arrival of public-utility computer grids is inevitable, it’s not going to happen tomorrow.
“If I were to use my crystal ball,” says Broda, “I’d say we’re looking at the 2006 to 2008 timeframe.”
In the meantime, companies are already beginning to experiment with various approaches to utility, or on-demand, computing.
The use of grid computing by itself is no longer a rarity: a number of companies have turned to vendors like Platform, or to the open-source Globus Toolkit, for software to build grids. Most of these efforts, however, have remained firmly within the four walls of a single organization.
The biotech firm Incyte Genomics, of Palo Alto, Calif., for example, is using an in-house grid of 1,200 Linux and Unix servers to develop new drugs.
But now, grid computing is beginning to spill out over the boundaries of individual organizations. A few organizations are creating what Broda calls “partner grids,” entering into arrangements where they share computing resources, and in some cases, data and applications, with other groups.
One chip manufacturer that uses Platform’s software makes its internal grid available to companies collaborating with it on chip designs, says Broda. “The company allows its design partners to go in and modify the chip designs and then test them using its own computing power,” he says.
There are other forms of partner grids as well. A breast cancer research grid based at the University of Pennsylvania, in Philadelphia, will connect the university’s hospital with research hospitals in Chicago, North Carolina, and Toronto. The grid, which is being built with the Globus Toolkit, will allow physicians to view and analyze digitized mammograms. The tremendous computing power available on the grid will enable it both to handle the large mammogram image files and to run sophisticated algorithms on large data sets, looking for cancer “clusters” in a particular community.
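To make the cluster-hunting idea concrete, here is a toy sketch in Python of the kind of density scan such a grid might parallelize across regions. The function, thresholds, and coordinates are invented for illustration; this is not the Penn project’s actual algorithm.

```python
# Toy sketch of a "cluster" scan a research grid could parallelize:
# count case density around each point and flag hot spots.
# Purely illustrative; not the actual algorithms used on the grid.
from math import dist

def find_clusters(cases: list[tuple[float, float]],
                  radius: float, threshold: int) -> list[tuple[float, float]]:
    """Return case locations with at least `threshold` neighbors inside
    `radius` (the count includes the point itself). On a grid, each
    region's scan could run on a different node."""
    hot = []
    for center in cases:
        neighbors = sum(1 for c in cases if dist(center, c) <= radius)
        if neighbors >= threshold:
            hot.append(center)
    return hot

# Example with synthetic coordinates standing in for geocoded cases.
sample = [(0.0, 0.0), (0.1, 0.1), (0.05, 0.0), (5.0, 5.0)]
print(find_clusters(sample, radius=0.5, threshold=3))  # flags the dense trio
```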
Private utilities
Other companies are going even further, taking an approach to grid computing that IDC’s Tapper calls a “private utility.” Financial powerhouses Deutsche Bank and JP Morgan Chase, for example, recently announced large outsourcing deals with IBM that call for Big Blue to supply extra computing power whenever the Wall Street firms’ needs outstrip their own capacity.
IBM is making a concerted push in the direction of deals like these, which it terms on-demand computing.
The arrangement aims to let the two financial firms handle surges in demand for processing power while still keeping their computers in-house, explains Tapper. “When they need excess capacity, they can go out and reach into IBM’s resources,” he says.
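In code, the overflow model looks roughly like the sketch below. The UtilityClient class and its lease_cpus method are invented stand-ins for whatever interface the outsourcer actually exposes; the point is only the fill-local-first, burst-to-utility logic.

```python
# Minimal sketch of the "private utility" overflow model. UtilityClient
# and lease_cpus are invented for illustration; no real provider API
# is implied.

class UtilityClient:
    """Stand-in for an external utility-computing provider."""

    def lease_cpus(self, count: int) -> list[str]:
        # A real service would provision remote capacity here;
        # this stub just fabricates node names.
        return [f"utility-node-{i}" for i in range(count)]

def place_job(cpus_needed: int, local_free: int,
              utility: UtilityClient) -> list[str]:
    """Fill the request in-house first, then reach into the utility
    for any overflow, the pattern Tapper describes."""
    local = [f"local-node-{i}" for i in range(min(cpus_needed, local_free))]
    overflow = cpus_needed - len(local)
    remote = utility.lease_cpus(overflow) if overflow else []
    return local + remote

# During a surge, a 12-CPU job on a firm with 8 free in-house CPUs
# runs on 8 local nodes plus 4 leased utility nodes.
print(place_job(cpus_needed=12, local_free=8, utility=UtilityClient()))
```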
One of the key requirements for the success of services like this is safeguarding the security of data on the grid. While the technology to do that is available today, says Broda, many users remain cautious. “It’s a perception issue,” he says. “People are not going to share their data unless they have absolute confidence in the security and privacy of the environment.”
There are other requirements as well, such as rapid provisioning. The utility must be able to provide extra processing power on demand, says Broda. “If I go to my utility and say I need an extra ten CPUs, it can’t take weeks to get them,” he says. “It has to be provisioned nearly immediately.”
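Broda’s requirement translates naturally into a deadline on the provisioning call itself: ask for ten CPUs and either get them within seconds or fail fast. The sketch below is entirely hypothetical; real utility interfaces would differ, but the contract is the point.

```python
import time

def provision_cpus(count: int, deadline_s: float = 5.0) -> list[str]:
    """Request `count` extra CPUs from the utility, failing fast if the
    allocation is not nearly immediate. Illustrative only: the
    allocation step is a stub standing in for a real provider call."""
    start = time.monotonic()
    nodes = [f"cpu-{i}" for i in range(count)]  # stub for real allocation
    elapsed = time.monotonic() - start
    if elapsed > deadline_s:
        raise TimeoutError(
            f"utility took {elapsed:.1f}s; provisioning must be nearly immediate")
    return nodes

# "An extra ten CPUs" should arrive in seconds, not weeks.
extra = provision_cpus(10)
print(f"provisioned {len(extra)} CPUs")
```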
Vendors hoping to provide utility computing services are starting to think in these terms, Broda says.
And as the vendors become more practiced at providing the service, the private utility model is bound to grow, according to Tapper.
“It’s a natural extension of outsourcing,” he says. “First, customers manage it themselves. Then when labor costs increase and products become commoditized, they begin looking for others to take it over. The easier it is to do, the more likely you are to get someone else to do it for you, especially if it’s not your core competency. It’s like water — why run your own water plant when there’s someone else who specializes in that?”