In the high-tech worlds of corporate computing, networking and data storage, fake suddenly is in.
No, we’re not talking about nose jobs, Botox, tummy tucks or liposuction. And please, get your minds out of the gutter—we’re certainly not speaking of breast implants (at least not here).
Instead, the trend du jour is virtualization, a broad term that describes the abstraction of computer resources to some degree. Using virtualization, companies can make a single physical resource (a server, operating system or storage device) appear to function as multiple resources, or they can make multiple resources (such as servers or even individual PCs) appear to function as one.
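The "one resource appears as many" idea can be sketched in a few lines of code. The toy classes below (PhysicalDisk and VirtualDisk are invented names for this illustration; real hypervisors and storage arrays are vastly more sophisticated) show two virtual disks that each believe they own a disk, while actually sharing slices of a single physical one:

```python
class PhysicalDisk:
    """One physical resource: a single block of storage."""
    def __init__(self, size):
        self.blocks = bytearray(size)

class VirtualDisk:
    """A virtual view: each instance sees only its own slice of the physical disk."""
    def __init__(self, physical, offset, size):
        self.physical, self.offset, self.size = physical, offset, size

    def write(self, pos, data):
        if pos + len(data) > self.size:
            raise ValueError("write exceeds virtual disk bounds")
        start = self.offset + pos
        self.physical.blocks[start:start + len(data)] = data

    def read(self, pos, length):
        start = self.offset + pos
        return bytes(self.physical.blocks[start:start + length])

# Carve one 1 KB physical disk into two independent 512-byte virtual disks.
disk = PhysicalDisk(1024)
vd1 = VirtualDisk(disk, 0, 512)
vd2 = VirtualDisk(disk, 512, 512)

vd1.write(0, b"hello")
print(vd2.read(0, 5))  # vd2's slice is untouched: b'\x00\x00\x00\x00\x00'
```

Each virtual disk gets an isolated view, yet only one physical device exists underneath; that, in miniature, is the abstraction virtualization sells.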
The phenomenon certainly is on the rise. Gartner deemed virtualization a “megatrend” in 2005, and a Forrester Research report from 2007 indicates that 51 percent of more than 1,750 companies said they are testing the strategy or have already deployed it, up from 41 percent in 2006.
While virtualization's champions argue that the strategy cuts costs and is good for the environment, critics counter that it creates significant overhead and concentrates risk by centralizing resources. Tim Mueting, solutions manager at hardware vendor AMD, said that however companies perceive virtualization, there's no disputing the strategy can be worthwhile.
“When you consider that most servers are running at 10 to 15 percent utilization, at the very least virtualization is a way to start consolidating and get better utilization out of technology,” he said. “In the end, the strategy enables companies to manage their network environments in a much more dynamic way.”
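Mueting's utilization math is easy to make concrete. The back-of-envelope sketch below uses hypothetical figures (50 servers, a 60 percent consolidation target) that are not from the article, but the 10 to 15 percent utilization range is the one he cites:

```python
import math

# Hypothetical shop: 50 physical servers, each averaging 12% utilization
# (within the 10-15% range Mueting cites).
servers = 50
avg_utilization = 0.12

# Suppose consolidation targets a conservative 60% utilization per host.
target_utilization = 0.60

# Total work, expressed in "fully busy server" equivalents.
total_load = servers * avg_utilization              # 6.0 server-equivalents
hosts_needed = math.ceil(total_load / target_utilization)

print(hosts_needed)  # 10 hosts can carry the load of 50 lightly used servers
```

The exact ratio depends on workload peaks and headroom policy, but the direction of the math is why consolidation is usually the first argument made for virtualization.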
How It Works
Ask 100 different people to define virtualization and you’re bound to get 100 different answers. While some products enable virtualization, virtualization itself is not a product but rather a strategy. The strategy creates an external interface that hides underlying implementations. Put differently, it creates an artifice, or artificial reality.
Within this façade, virtualization comes in many sizes and shapes. Platform virtualization involves the simulation of whole computers. Resource virtualization involves the simulation of combined, fragmented or simplified resources. However you look at it, virtualization is an exercise in optimization, and it makes software and hardware run more efficiently.
In the scheme of things, virtualization is nothing new; the computing approach has been around in one form or another since the 1960s. Some say the phenomenon dates back to the IBM M44/44x system, which was dubbed a “virtual machine.”
In the old days, virtualization only existed in the data center, where network administrators could partition storage servers and make many from one. Today, however, virtualization also can enable software that was written for single-core processors to be moved onto multi-core processors without rewriting the entire application stack.
Companies in the space are plentiful. VMware, which is operated as a subsidiary of Massachusetts-based EMC, is the market leader. Nearly 53 percent of respondents in the Forrester report said they would consider VMware for virtualization, versus just 9 percent for Microsoft's Virtual Server.
There are scores of other virtualization options too—smaller entities that have made the trend more affordable by “productizing” it. VirtualLogix, for instance, a software firm in Sunnyvale, Calif., incorporates virtualization into connected devices such as mobile phones and set-top boxes. There’s even open-source virtualization software named Xen.
[Chart: Virtualization Services Market, 2007-2011 Forecast]
Many small and mid-sized businesses turn to virtualization for the economics. Because they are replacing multiple servers (or other pieces of hardware) with one, hardware expenditures amortized over time drop considerably. Some businesses report a savings of more than 40 percent after two years of virtualization. Others have seen numbers even more stunning than that.
Virtualization reduces costs in another area: energy. Because companies that virtualize require less hardware, they use less energy to power their technology, and less energy to cool it. Dawn Wells, senior product manager at Verio, a hosting company in Centennial, Colorado, said the economics speak for themselves.
“You’re getting control and cost savings without having to spend thousands of dollars a month as you would if you bought your own stuff,” said Wells. “Who would complain about that?”
At a time when awareness of global warming is rising, another benefit of virtualization is a smaller environmental footprint. This is a natural extension of using less energy; it also has to do with the carbon footprint a company reduces by requiring fewer pieces of hardware (hardware takes energy to build, you know).
One final benefit of virtualization is agility. Because virtualized servers are more efficient, they improve flexibility and enable companies to scale and respond to customer demand quickly. Ute Albert, marketing manager for virtualization at HP, said this expedites deployment across the board.
"When you need to get more resources to an existing workload or bring a new project online, this really is a benefit," she said. "I don't care what industry you're in—there are lots of examples where faster time to market counts."
Still, virtualization isn’t perfect. Because the strategy introduces complexity into a network, many implementations require additional system resources to run the processes. Virtualization experts refer to this as “overhead,” and note that in some cases, virtualizing requires significant additional investments in bandwidth to ensure the network functions properly.
Ken Salchow, manager of technical marketing at F5 Networks, a networking company in Seattle, said that depending on the implementation, investing in an entire layer of these resources could become a critical part of successful virtualization.
“Whatever the network infrastructure, primary applications need network resources to work the way they’re supposed to,” Salchow said. “Businesses need to focus on these resources; simply virtualizing is not enough.”
Another potential pitfall: losing track of the virtual infrastructure. Jonathan Bryce, founder of Mosso, a hosting company in San Antonio, Texas, said that because it’s so easy to create new resources using virtualization, companies need to take a disciplined approach to provisioning and monitoring virtual machines.
Perhaps the final challenge is total system failure. Because virtualized companies run multiple workloads off one device, a hardware failure can wreak havoc on a whole segment of a network. This can cause widespread service disruptions, a reality that Andrew Barnes, senior vice president of corporate development at Neverfail, a disaster recovery company in Austin, Texas, said businesses must recognize.
“One of the biggest misconceptions regarding virtualization is that true disaster recovery and high availability are built into the products offered by the major virtualization software vendors,” said Barnes. “The fact is that companies need to handle much of this on their own.”
Despite these challenges, experts say affordability should keep virtualization around for a while. Numbers from Gartner and Forrester suggest that businesses are still warming to the idea, and one can only imagine more small- and mid-sized businesses will embrace the strategy as they push toward “greener” operations in the months and years ahead.
Still, there is room for improvement. A recent study by Novell and Lighthouse Research found that while 45 percent of 411 respondents had implemented virtualization technology, few were using automated management tools to improve efficiency and resource utilization in the data center.
Ben Rudolph, a spokesperson for SWSoft, a virtualization company in Herndon, Va., said businesses need to remember that virtualization is not a "miracle drug," and that without proper foresight and management it can become more of a burden than a boon.
"If you move from 50 real servers to 5 real servers with 50 virtual servers, you've actually complicated your infrastructure," he said. "Good planning and good management tools can help any business clear the confusion and get its hands around its virtual deployment so it can truly optimize its infrastructure."