The idea of virtual data centers, where the vast infrastructure of storage, servers and networks can be managed as one, has been around since IBM accomplished essentially the same thing on the mainframe nearly 50 years ago. It’s taken decades to achieve, but a few companies today may be close to matching IBM’s feat from the mainframe era in modern data centers.
IBM’s pioneering work in mainframe virtualization was an inspiration for VMware’s launch many years later. And just as IBM virtualized what was then the entire computing environment – the mainframe – so today companies like VMware, Citrix and Red Hat are trying to do the same thing across the distributed data center infrastructure of servers, storage and networks.
The pioneering work by IBM engineers Robert Creasy and Les Comeau in 1964 gave each mainframe user a virtual machine (VM), a self-contained operating environment that allowed many applications to run simultaneously in the same mainframe. Fast forward to today, where a new generation of upstarts is attempting to do the same thing with modern distributed data centers by separating the operating system and application layers from the entire underlying physical infrastructure.
In effect, the new software-defined data center management layer becomes something like a PC operating system for the entire data center, masking if not simplifying the underlying physical infrastructure while making sure that applications get the resources they need to run.
As computing shifted from mainframe to distributed environments, the virtualization problem became more complex, but the solution is essentially the same. It took decades to evolve, from job scheduling and root-cause analysis to run-book automation and provisioning, configuration and change management, but a few companies are close to assembling the equivalent virtualization technology in today’s data centers.
Taken to its logical extreme, next-generation data centers could pool all data center hardware and run applications according to policies governing importance, timeliness and security, placing each application on the most appropriate physical resource regardless of where it's located. Storage and network hardware could become as commoditized as servers have been for some time. Self-service IT – where users essentially launch their own apps and services at the click of a mouse – could become a reality. Specialized roles like storage, network and server administrator could disappear or become less important if everything can be run from a single console.
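The policy-driven placement described above can be illustrated with a minimal sketch. All names, fields and thresholds here are hypothetical, not any vendor's actual scheduler: each application carries policies (resource need, timeliness, security), and a placement function picks the best-fitting host from the pool.

```python
# Hypothetical sketch of policy-based workload placement: applications carry
# policies, and the scheduler picks a suitable host from the pooled hardware.
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    free_cpu: int          # spare vCPUs
    latency_ms: float      # network latency to users (timeliness)
    security_zone: str     # e.g. "dmz" or "internal" (security)

@dataclass
class App:
    name: str
    cpu_needed: int
    max_latency_ms: float  # timeliness policy
    security_zone: str     # security policy

def place(app, pool):
    """Return the host that satisfies all policies with the most headroom."""
    candidates = [
        h for h in pool
        if h.free_cpu >= app.cpu_needed
        and h.latency_ms <= app.max_latency_ms
        and h.security_zone == app.security_zone
    ]
    return max(candidates, key=lambda h: h.free_cpu, default=None)

pool = [Host("rack1-a", 8, 5.0, "internal"), Host("rack2-b", 16, 20.0, "internal")]
app = App("billing", cpu_needed=4, max_latency_ms=10.0, security_zone="internal")
print(place(app, pool).name)  # rack1-a (the only host meeting the latency policy)
```

A real management layer would add contention handling, live migration and far richer policies, but the core idea is the same: the physical resource is chosen by policy, not by an admin.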
The truth will likely be somewhat less than the vision, as it always is. Technology gets more complex even as users wish for simplicity, new technologies arise over time, and there are always unintended and unforeseen consequences. But IT departments could be about to change dramatically as a result of the rise of the software-defined data center.
VMware’s virtual vision
The front-runner in the software-defined data center gold rush may be VMware. The 15-year-old company was acquired by storage giant EMC for $625 million in 2004, an investment that has since grown to about $30 billion in value.
VMware’s core technology dominates the market for server virtualization. About five years ago, the company’s astronomical growth hit its first speed bump, and it was around that time that the company first began to think about expanding beyond its core server virtualization market.
Cheap server hardware had spurred the company's meteoric rise. It had become so cheap to "rack and stack" new servers to solve performance problems that server sprawl had grown out of control. VMware's ability to make one physical machine act as many virtual ones, and then cluster those machines together, simplified server management – and turned servers into a commodity.
But Neela Jacques, senior cloud strategist for VMware's vCloud Suite, notes that servers are only a small part of data center complexity – in fact, about 5% of the problem. Network and storage growth and complexity are a much tougher nut to crack. And with workloads doubling every three years, organizations don't have many options. They can hire more admins to keep pace, automate with a tangle of scripts, bring in a consultant or a management vendor – or they can automate based on policies, the software-defined data center approach.
The rise of Amazon EC2 was an inspiration for private clouds and virtual data centers, Jacques notes. Companies starting from scratch were turning their IT infrastructures over to Amazon rather than dealing with the complexity themselves, even at the risk of being locked in to a cloud vendor. That willingness was a wakeup call for established IT vendors.
VMware’s goal is to simplify the underlying infrastructure, not just mask the complexity, says Jacques.
The company’s vCloud Suite has a number of components for carrying out that vision:
vSphere: Virtualized infrastructure with policy-based automation.
vCloud Director: Virtualized datacenters with multi-tenancy and public cloud extensibility.
vCloud Connector: Integrated viewing and dynamic transfer of workloads between private and public clouds.
vCloud Networking and Security: Software-defined networking, security, and ecosystem integration.
vCenter Site Recovery Manager: Automated disaster recovery planning, testing, and execution.
vCenter Operations Management Suite: Integrated, proactive performance, capacity, and configuration management for dynamic cloud environments.
vFabric Application Director: Multi-tier application service catalog publishing and provisioning.
vCloud Automation Center: Self-service and policy-enabled cloud service provisioning.
The virtual data center revolution is in the early stages. By one estimate, only about 5% of Fortune 500 companies are anywhere near fully virtualizing their data centers. Jacques likens it to the second or third inning of a baseball game, while server virtualization is in the eighth inning.
But while VMware’s ambition is to offer a comprehensive data center automation suite, Jacques says the company welcomes competition, both from other vendors and the open source community. “We try to enable choice everywhere we can,” he said. “Competition and openness are good.”
The company even welcomes OpenStack experimenters, because they may turn to VMware for help after trying to build a virtual data center infrastructure themselves. Jacques notes that top officials from the company's recent Nicira acquisition have been deeply involved in OpenFlow and OpenStack. VMware wrote extensions so that vSphere can run in OpenStack, an open source cloud infrastructure project that pools server, storage and networking resources and has the backing of more than 200 IT vendors. And VMware has long resisted pressure to limit interoperability, whether from server vendors or from parent company EMC.
“I don’t expect companies to believe what we’re saying, but watch what we do,” Jacques said.
Citrix and the Jevons Paradox
Citrix is another company vying for early leadership in the software-defined data center market.
The company has stitched together a virtual data center ecosystem that includes its Xen virtualization offerings, its NetScaler cloud network platform, open source and standards projects like OpenFlow, CloudStack and OpenDaylight, and partnerships with the likes of NetApp and others.
Still, Peder Ulander, vice president for Open Source Solutions at Citrix, thinks the automation revolution will never be fully complete. “I don’t think we ever get to the point where everything is 100% abstracted and automated,” he said.
That said, organizations are already seeing big gains from their data center virtualization efforts. Ulander sees users gaining more control without IT staff assistance, with IT services delivered on demand to users via self-service portals. With Citrix vApps, for example, users can provision a LAMP stack in minutes, saving weeks on simple application delivery. With software-defined networks (SDN), admins will be saved from mundane tasks like provisioning VLANs, he says.
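The self-service model Ulander describes can be sketched in a few lines. The catalog names and components below are hypothetical illustrations, not Citrix's actual blueprint format: a portal request for a named stack expands into the component machines an automation layer would then provision, with no admin in the loop.

```python
# Hypothetical sketch of a self-service catalog: a user requests a blueprint
# by name, and it expands into the component instances to be provisioned.
CATALOG = {
    "lamp": ["linux-base", "apache", "mysql", "php-runtime"],
    "static-site": ["linux-base", "nginx"],
}

def provision(blueprint, owner):
    """Expand a catalog blueprint into named instances for one user."""
    if blueprint not in CATALOG:
        raise KeyError(f"no such blueprint: {blueprint}")
    return [f"{owner}-{component}" for component in CATALOG[blueprint]]

print(provision("lamp", "alice"))
# ['alice-linux-base', 'alice-apache', 'alice-mysql', 'alice-php-runtime']
```

The point of the sketch is the shape of the workflow: the user picks from a catalog, and policy and automation handle everything that used to take weeks of tickets.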
Ulander notes that Citrix customers like Zynga and Nokia are seeing a 10-fold return on their virtual data center investments. “Not only does it pay for itself, but you’re driving creativity and innovation,” he said. “Enterprises that don’t look at doing this are throwing money away.”
Oddly enough, Ulander doesn’t see cost savings as a big benefit of software-defined data centers. He cites the Jevons paradox, a nearly 150-year-old economic theory that arose from an observation about the relationship between coal efficiency and consumption. Making a resource easier to use leads to greater consumption, not less, says Ulander. As users can do more for themselves and don’t have to wait for IT, they do more, so more gets used.
“The real gain is in the agility of the business,” said Ulander. “A self-service model is where the future is headed.”
“We have a long way to go before this becomes mainstream, but I believe it’s going to be transformative,” said Ulander. “It’s exciting to be part of this and watch it grow. We’re very much playing at the core of this.”
Red Hat and the open source data center
Red Hat, not surprisingly, is taking an open source approach to virtual data centers.
“We want to get to an open hybrid cloud,” said Radhesh Balakrishnan, Red Hat’s general manager of virtualization. “We fully embrace software-defined anything.”
The company’s goal is “stateless apps” – if a piece of underlying hardware fails, it doesn’t matter to the end user, who doesn’t even notice the failure. Another goal is a common fabric for storage and networking. “Single pane of glass management is our goal,” said Balakrishnan.
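The "stateless apps" goal can be illustrated with a minimal sketch. This is a hypothetical simplification, not Red Hat's implementation: because application state lives in shared storage, the management layer can simply restart a workload on a healthy host when its current host fails, and the end user never notices.

```python
# Hypothetical sketch of stateless-app failover: any app whose host has
# failed is re-placed on a healthy host; state in shared storage survives.
def failover(placement, healthy):
    """Return a new app-to-host mapping with failed hosts replaced."""
    spare = sorted(healthy)          # deterministic pick of a healthy host
    new_placement = {}
    for app, host in placement.items():
        if host in healthy:
            new_placement[app] = host
        else:
            new_placement[app] = spare[0]  # naive choice: first healthy host
    return new_placement

placement = {"web": "host-a", "db": "host-b"}
print(failover(placement, healthy={"host-b", "host-c"}))
# {'web': 'host-b', 'db': 'host-b'}
```

A production system would weigh capacity and policies when re-placing workloads, but the principle is the one Balakrishnan describes: hardware failure becomes invisible above the fabric.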
At the heart of Red Hat's approach are the KVM hypervisor, Red Hat Enterprise Virtualization, Red Hat Storage, ManageIQ, CloudForms and OpenStack.
With the company’s foundation of stable, hardened versions of open source software, Red Hat claims a cost advantage over VMware. “For the price of virtualization, you can get cloud,” said Balakrishnan.
He considers OpenStack a competitor to VMware's vCloud, and the "default infrastructure platform" for everything above the fabric in private clouds.
“If you want to get a meeting with a CIO, the best way is to say you want to have a conversation about OpenStack,” said Balakrishnan. He expects to see OpenStack production deployments take off in the next 12-18 months. “The integration layer is maturing very fast,” he said.
He sees cost reduction, agility and future-proofing as the biggest benefits of a virtual data center.
“It will take some of the complexity and cost out of hardware, but you won’t get to 0/100,” he said.
Implications of virtual data centers
As with any big technological change, the biggest question surrounding software-defined data centers is what the future will look like. Will admin jobs vanish? Will users spin up IT services by themselves, with no help from IT staffers? Will all hardware become a commodity?
The truth is likely to reside somewhere in the middle. VMware’s Jacques predicts that more and more basic tasks will get automated, but complex new tasks will arise that require IT expertise. He cites Big Data as one emerging new area of technological complexity. And organizations will always need someone to swap out and maintain the underlying hardware.
Complexity is a “moving target,” Jacques said. “Will IT become simpler? Yes, but at the same time, other things will become more complex.”
He doesn’t see all hardware becoming “dumb,” with the intelligence residing in the data center management layer. Native hardware intelligence will likely always have a role, and it’s up to management software vendors like VMware, Citrix and Red Hat to put it to use. And there will still be room for higher-margin systems, as Cisco has shown with its integrated Unified Computing System (UCS).
But there’s no question that many things will become simpler. Just as an operating system can mask underlying complexity, so too will the new generation of software-defined data centers. And IT roles will certainly change as a result. Admins will still be needed, but might become something of a commodity themselves, while Big Data analysts and other as yet unforeseen roles will become the new sought-after careers. And a new generation of IT vendors may emerge to lead the way.
For one company’s pioneering experience with software-defined data centers, see MicroStrategy Reaps Virtualization’s Benefits with Software-Defined Data Center.
Paul Shread is editor in chief of the IT Business Edge network.