The idea of virtual data centers, where the vast infrastructure of storage, servers and networks can be managed as one, has been around since IBM first virtualized the mainframe nearly 50 years ago. It's taken decades to achieve, but a few companies today may be close to matching IBM's feat from the mainframe era in modern data centers.
IBM's pioneering work in mainframe virtualization was an inspiration for VMware's launch many years later. And just as IBM virtualized what was then the entire computing environment – the mainframe – so today companies like VMware, Citrix and Red Hat are trying to do the same thing across the distributed data center infrastructure of servers, storage and networks.
The pioneering work by IBM engineers Robert Creasy and Les Comeau in 1964 gave each mainframe user a virtual machine (VM), a self-contained operating environment that allowed many applications to run simultaneously on the same mainframe. Fast forward to today, where a new generation of vendors is attempting to do the same thing with modern distributed data centers by separating the operating system and application layers from the entire underlying physical infrastructure.
In effect, the new software-defined data center management layer becomes something like a PC operating system for the entire data center, masking if not simplifying the underlying physical infrastructure while making sure that applications get the resources they need to run.
As computing shifted from mainframe to distributed environments, the virtualization problem became more complex, but the solution is essentially the same. It took decades to evolve, from job scheduling and root-cause analysis to run-book automation and provisioning, configuration and change management, but a few companies are close to assembling the equivalent virtualization technology in today's data centers.
Taken to its logical extreme, next-generation data centers could pool all data center hardware and run applications according to policies governing importance, timeliness and security, placing each application on the most appropriate physical resource regardless of where it's located. Storage and network hardware could become as commoditized as servers have been for some time. Self-service IT – where users essentially launch their own apps and services at the click of a mouse – could become a reality. Specialized roles like storage, network and server admins could disappear or become less important if everything can be run from a single console.
The truth will likely be somewhat less than the vision, as it always is. Technology gets more complex even as users wish for simplicity, new technologies arise over time, and there are always unintended and unforeseen consequences. But IT departments could be about to change dramatically as a result of the rise of the software-defined data center.
VMware's virtual vision
The company furthest ahead in the software-defined data center gold rush may be VMware. The 15-year-old company was acquired by storage giant EMC for $625 million in 2004, an investment that has since grown to about $30 billion.
VMware's core technology dominates the market for server virtualization. About five years ago, the company's astronomical growth hit its first speed bump, and it was around that time that the company first began to think about expanding beyond its core server virtualization market.
Cheap server hardware had spurred the company's meteoric rise. It had become so cheap to "rack and stack" new servers to solve performance problems that server sprawl had grown out of control. VMware's ability to make one machine act as many, and then cluster those machines together, simplified server management – and turned servers into a commodity.
But Neela Jacques, senior cloud strategist for VMware's vCloud Suite, notes that servers account for only a small part of data center complexity – about 5% of the problem. Network and storage growth and complexity are a much tougher nut to crack. And with workloads doubling every three years, organizations don't have many options: they can hire more admins, automate with a tangle of scripts, bring in a consultant or management vendor – or automate based on policies, the software-defined data center approach.
The rise of Amazon EC2 was an inspiration for private clouds and virtual data centers, Jacques notes. Companies starting from scratch were turning their IT infrastructures over to Amazon rather than wrestling with the complexity themselves, even at the risk of being locked in to a cloud vendor. That willingness was a wakeup call for established IT vendors.
VMware's goal is to simplify the underlying infrastructure, not just mask the complexity, says Jacques.
The company's vCloud Suite has a number of components for carrying out that vision:
vSphere: Virtualized infrastructure with policy-based automation.
vCloud Director: Virtualized data centers with multi-tenancy and public cloud extensibility.
vCloud Connector: Integrated viewing and dynamic transfer of workloads between private and public clouds.
vCloud Networking and Security: Software-defined networking, security, and ecosystem integration.
vCenter Site Recovery Manager: Automated disaster recovery planning, testing, and execution.
vCenter Operations Management Suite: Integrated, proactive performance, capacity, and configuration management for dynamic cloud environments.
vFabric Application Director: Multi-tier application service catalog publishing and provisioning.
vCloud Automation Center: Self-service and policy-enabled cloud service provisioning.
The virtual data center revolution is in the early stages. By one estimate, only about 5% of Fortune 500 companies are anywhere near fully virtualizing their data centers. Jacques likens it to the second or third inning of a baseball game, while server virtualization is in the eighth inning.
But while VMware's ambition is to offer a comprehensive data center automation suite, Jacques says the company welcomes competition, both from other vendors and the open source community. "We try to enable choice everywhere we can," he said. "Competition and openness are good."
In the company's eyes, OpenStack experimenters are a good thing, because they may turn to VMware for help after trying to implement a virtual data center infrastructure themselves. Jacques also notes that top officials from the company's recent Nicira acquisition have been deeply involved in OpenFlow and OpenStack. VMware wrote extensions so that vSphere can run in OpenStack, an open source cloud infrastructure project that pools server, storage and networking resources and has the backing of more than 200 IT vendors. And VMware has long resisted pressure to limit interoperability, be it from server vendors or from parent company EMC.
"I don't expect companies to believe what we're saying, but watch what we do," Jacques said.
Citrix and the Jevons Paradox
Citrix is another company vying for early leadership in the software-defined data center market.
The company has stitched together a virtual data center ecosystem that includes its Xen virtualization offerings, its NetScaler cloud network platform, open source and standards projects like OpenFlow, CloudStack and OpenDaylight, and partnerships with companies like NetApp.