Virtualized Servers: Less Work or More?

SAN JOSE, Calif. — The datacenter of the future may be at least partially virtualized, but the consolidation of hardware does not mean less work. On the contrary, more virtual servers still mean more servers to maintain, increasing the burden on IT, not decreasing it.

That was the warning from a session here at the IDC Directions ’09 conference, where the market and technology research giant is meeting with customers and clients to discuss strategies and planning for the coming years, including how to weather the economic storm blowing over the U.S.

The emphasis on cutting costs is increasing annually, noted Michelle Bailey, research vice president in IDC’s datacenter trends and strategies group. She said almost 40 percent of IT managers surveyed by IDC said that cost savings was their primary concern.

Virtualization has often been seen as something of a magic bullet for this problem, promising to consolidate a number of low-utilization servers onto a single piece of hardware. But the average number of virtual machines per server is only five, Bailey noted, with that number expected to rise to eight by 2012.

So much for the vision of consolidating dozens of servers onto one machine. More important, IDC found that just going from five virtual machines per server to eight means there will be 100 million new servers by 2012, and "all of them still need to be managed."
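The arithmetic behind that warning can be sketched in a few lines. The per-server VM densities (five and eight) are the IDC figures cited above; the fleet size is a purely hypothetical number chosen for illustration:

```python
# Hedged sketch: why higher VM density still grows the number of managed
# servers. Densities of 5 and 8 VMs per host are the IDC figures above;
# the physical fleet size is an illustrative assumption, not a real datum.

physical_hosts = 1_000           # hypothetical virtualized fleet

vms_2009 = physical_hosts * 5    # avg. 5 VMs per server (today)
vms_2012 = physical_hosts * 8    # avg. 8 VMs per server (by 2012)

# Every VM is still a "server" that must be patched, monitored, and backed up,
# on top of the physical hosts themselves.
managed_2009 = physical_hosts + vms_2009
managed_2012 = physical_hosts + vms_2012

print(managed_2009)  # 6000 instances to manage
print(managed_2012)  # 9000 instances to manage
```

The hardware footprint stays flat, but the count of managed instances grows with density, which is exactly why Bailey argues the load lands on management tooling rather than on the server room.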

That’s a problem, she said, since the tools to do this are not keeping pace.

“If spending on tools doesn’t pick up, we’re going to see a stall in the virtualization market, because there’s no way you can manage all this with the people we have,” she said. “Automation is key.”

Smarter datacenters

IDC also said the enterprise datacenter is shrinking — not so much due to virtualization, but because of the shift to outsourcing. Enterprise datacenters will shrink from 77 percent of the datacenter market today to 65 percent by 2012.

Scale-out datacenters, where enterprise computing jobs are hosted and farmed out by companies, will grow from 9 percent to 16 percent, and small site datacenters, which will serve smaller markets and firms, will grow from 14 percent now to 19 percent in 2012.

As a result, datacenter design needs to get smarter, rather than relying on massive overcapacity, a common problem. Instead of building a 100,000-square-foot datacenter and using just 5,000 square feet of it, build a 5,000-square-foot modular design and add capacity in small, repeatable increments, IDC said.
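The modular approach IDC recommends can be sketched as a simple capacity calculation. The 5,000-square-foot module size comes from the example above; the demand figures are hypothetical:

```python
import math

# Hedged sketch of IDC's modular build-out: start with one 5,000 sq ft
# module and add identical modules as demand grows, instead of building
# 100,000 sq ft up front. Demand figures below are made-up examples.

MODULE_SQFT = 5_000

def modules_needed(demand_sqft: int) -> int:
    """Smallest number of modules that covers current demand."""
    return max(1, math.ceil(demand_sqft / MODULE_SQFT))

# As demand grows, built capacity tracks it in small increments,
# so unused floor space stays bounded by one module's worth.
for demand in (4_000, 12_000, 30_000):
    built = modules_needed(demand) * MODULE_SQFT
    print(demand, built)
```

The design choice is the same one behind any incremental provisioning scheme: overcapacity is capped at one module, instead of the 95 percent idle floor space in the monolithic example.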

Changing face of hypervisors

Another track discussed the changing position of the hypervisor. Virtualization is no longer just the preserve of large companies with massive numbers of servers to consolidate. Instead, it has also found its way into midsized datacenters, according to Al Gillen, research vice president for system software at IDC.

Hypervisor prices have gone to zero, mostly thanks to Microsoft (NASDAQ: MSFT) giving away its Hyper-V, and the rest of the industry following suit. So it's no longer about the cost of the hypervisor, nor is there much of a market for one any more. Rather, the value-add has moved to management tools, IDC said.

“A few years from now, it won’t matter whose hypervisor you have in place, but what those tools can do with the hypervisors,” Gillen told an audience during his discussion.

A sign that the industry is already acknowledging this came from Microsoft, which, Gillen said, did something very atypical with System Center Virtual Machine Manager: It manages competing software like VMware’s (NYSE: VMW) and Xen’s hypervisors.

“Who would have thought Microsoft and Red Hat could talk about interoperability at a hypervisor level?” he told his audience.

The value and importance, then, lie in systems management software, although such tools still need improvement, he added.

For instance, to be truly effective, systems management tools need the ability to manage online and offline system images, and have to understand the operating system and workloads and be able to provision accordingly, Gillen said.

Other must-have features for systems management software include self-provisioning, maintaining service levels and supporting service-oriented architecture commitments, redundancy, resiliency and recoverability — and they must be cost-effective, he added.

This article was first published on InternetNews.com.
