As 2008 draws to a close, virtualization remains one of the few bright spots in a year mired in financial debacles, a deepening recession and credit crisis that doesn’t seem to be going away any time soon. Virtualization has become a shelter of sorts in the rapidly brewing storm of economic uncertainty.
In 2008, virtualization seemed to have an unstoppable boom of its own. It has, for all intents and purposes, moved into “killer app” territory. This is unprecedented given that x86 virtualization is barely out of its infancy. In 2007, it went from being a technology vendors needed to explain to enterprises and sell them on to a technology everyone asked for. In 2008, however, virtualization became a technology whose capabilities were expected from the chip level on up. It helped, of course, that Microsoft got behind virtualization with its release of Hyper-V in late June.
So what will 2009 bring? Will virtualization be able to maintain its location within the eye of the growing economic storm? Perhaps, but that doesn’t mean the winds of change won’t blow it around. That’s already begun.
1. VMware Passes the Reins
Nowhere was VMware’s view of itself clearer than at this year’s VMworld in Las Vegas. From the vendor’s end-to-end roadmap of what it foresaw for the data center to the way it ran the show, the message was clear: We are VMware, and we are king.
However, as its message that “Virtually Anything Is Possible” was being broadcast through the Venetian Resort-Hotel-Casino and beyond, another message was being heard: the sound of Lehman Brothers crumbling, then AIG teetering, followed by Merrill Lynch being bought in a fire sale.
And it’s not just market forces making life difficult for VMware. While the self-proclaimed monarch was busy planning its coronation, another self-proclaimed ruler was releasing the results of its vision, and that vision is free.
Hyper-V landed on the scene after a few months of delay. At this point it is only a hypervisor, but it’s an integral part of Windows Server 2008, and it’s essentially free.
And when Wall Street, Main Street and everything in between is melting down, free is good. Especially if the product is good enough and it’s got Microsoft behind it.
Financial institutions are typically the early adopters for emerging technologies. They are willing to pay big bucks for products that deliver a return on investment of bigger bucks. With budgets frozen or contracting, and a free product on hand, it will be VMware, not virtualization, that will be in question.
Thus, in 2009, VMware will likely have to work even harder to maintain its lead. At present it has about a 40 percent share, the largest by far of any vendor. With the market still largely untapped, it’s sure to lose at least some ground to Microsoft.
Another sign of Microsoft’s expanding presence comes from the ISVs that are developing products for Hyper-V as well as VMware ESX. In the past, ESX was the baseline. Now ISVs are either releasing products compatible with both, akin to Windows and Linux editions, or prioritizing Hyper-V right behind ESX.
VMware will also face increased competition at the other end of the spectrum. CA, Oracle and Symantec have a multitude of products, many of which compete directly with VMware’s offerings. VMware’s end-to-end plan puts it directly in these players’ arena. There will be blood. There will be consolidation.
2. A Breach in the Levee
Doing anything on the basis of “as cheap as possible” always has ramifications, and virtualization is no exception.
For some time analysts have predicted a security breach would bring the public’s attitude toward virtualization back down to earth. Surprisingly, that has not yet happened. As the number of virtual deployments increases, however, so does the likelihood. Especially given the current economic climate, where many enterprises are thinking in terms of short-term savings and virtualizing without proper process.
It may be a perennial prediction at this point, but as companies virtualize as quickly as they can to save money on equipment, mistakes will be made. These mistakes will range from virtual machine sprawl to wrongly provisioned hardware (most likely resulting in downtime) to gaping security holes.
Expect to read about some horror stories. To avoid becoming a case study for one, plan properly and don’t cut corners, especially when it comes to security.
3. Cloudy Skies
Love it or hate it, cloud computing doesn’t seem to be going anywhere but up. For cash-strapped enterprises, the cloud is a far more desirable option than buying new hardware, as it offloads the heavy lifting.
Amazon’s cloud is just the beginning. IBM has a cloud initiative in the works, and expect to see many more in 2009. For some enterprises, building an internal cloud will be the way to go.
Cloud computing isn’t terribly new. The concept goes back several decades and has gone by a multitude of names, including utility computing and supercomputing. Its present incarnation has been gaining in popularity for about two years now.
Hosting providers, however, are well-situated to reposition themselves to offer clouds. Expect to see a lot of noise around that.
Not every application is suitable for the cloud, but then, not every app is suitable for virtualization either. In general, mission-critical apps should not be put in a cloud, at least not initially, unless you’re willing to bet the business on them.
4. Wireless, the New Frontier
The excitement has long since left the PC vs. Mac arena. In 2009, it will be all about BlackBerry vs. iPhone. Successful data centers will take this into consideration and, when planning their virtual deployments, will look beyond traditional clients. Netbooks and smartphones will be increasingly sought after as clients next year, chiefly because of their price and flexibility.
Consumer devices are finding their way in through the back door, and enterprises that figure out a way to leverage them securely will have the upper hand in terms of both virtualization and general infrastructure management.
Amy Newman is the managing editor of ServerWatch. She has been covering virtualization since 2001.
This article was first published on ServerWatch.com.