Obviously, data centers will still exist in some form, but will these soon be confined to large service providers and Fortune 500 companies?
According to a recent report from the University of California at Berkeley, cloud computing services are already five to seven times more cost effective than traditional data center ones.
Even if you’re a cloud skeptic, don’t be surprised if someone from the business side of your organization starts asking about clouds because of studies like this one. In a deep recession, the lowest common denominator is cost. Any technology, proven or not, that promises to cut costs will get attention – whether IT likes it or not.
However, that doesn’t mean you need to start migrating your applications to Google or Amazon right away. What it does mean, though, is that you need to have a strategy you can articulate, even if that strategy is simply to wait and see.
“You may see some applications, low-criticality applications like email, being outsourced, but any real impact cloud computing will have on data centers is several years off,” said Nik Simpson, an analyst with the Burton Group.
A slew of obstacles prevent enterprises from moving critical applications to the cloud today, including a lack of SLAs and data-privacy and compliance issues. There’s also the fact that most cloud computing vendors don’t run the same applications that corporations do.
“Either your software vendors have to rewrite their applications or you have to rewrite your in-house applications. There are quite a lot of barriers to that at the moment,” Simpson said. “The shortest path to the cloud for most organizations will be through virtual machines, where you can take an existing virtual machine and simply push it to the cloud.”
Even moving virtual machines dredges up several issues, including bandwidth, latency and availability. “I think enterprises will ultimately move a significant portion of their applications suites – or equivalent applications – into the cloud, but it’s a long ways off,” he added.
The Elephant in the Room: Google
Google has been using the cloud as one of its strategies to poach in areas that Microsoft has traditionally owned. Gmail and Google Apps are quietly pecking away at Microsoft Office, and no one seems to mind – or even notice – that these are cloud-based applications.
Of course, these are consumer, not enterprise, applications, but momentum is momentum.
“Unquestionably, there will be fewer data centers operated by corporations in the future,” said Rajen Sheth, senior product manager for Google Apps.
“There are very few companies out there where it’s really a core part of their business to operate a data center.”
For instance, it makes more sense for an insurance company to focus on writing and selling policies, rather than managing and maintaining email.
Sheth agrees with Simpson that the easy stuff, like email, will move to the cloud first. However, Sheth argues that the underlying platforms that allow for broader data migration are already maturing.
“As more platforms and infrastructure-level technologies become available, like Google App Engine and Amazon Web Services, even custom applications will move to the cloud.”
Déjà vu All Over Again
Haven’t we been down this path before? In the late 90s and early 2000s, one of the most hyped technology sectors was the MSP/ASP space.
Although a few of those startups have survived, most did not. And MSPs/ASPs did not shake up the IT landscape as some thought they would.
What’s different this time?
“The cynical side of me says that there is no difference,” Simpson said. “I’m seeing all of these cloud storage providers making the same claims that storage service providers made in 2000 and 2001.”
According to Simpson, organizations are attracted to outsourcing intellectually. However, once they factor in things like compliance, bandwidth costs and increased latency, the intellectual foundation starts to crumble.
“The ASP companies who survived turned to things like hosting Microsoft Exchange,” Simpson added. “There are some applications that lend themselves to that model, but there are even more that don’t.”
Simpson gave the example of databases. At first glance, they seem like a logical thing to move to the cloud. “Yes, they’re all running SQL Server, but they’re all totally different in terms of the underlying database design and the tables. It’s much more difficult to create a standardized database server instance that you can rent out in the same way you can with Exchange,” he said.
Google, for its part, is not blind to these issues. “The data center isn’t going to fully go away,” Sheth admitted. “There are always going to be applications that organizations do believe are part of their core businesses.”
These, Sheth believes, probably will stay internal, although he believes there will be fewer and fewer in-house corporate data centers as time goes on.
The Need for New IT Models
Despite reports claiming great cost savings when applications are moved to the cloud, that’s not always the case.
Virtualizing applications as they are today and moving them to the cloud probably won’t make sense. Costs won’t go down. You’ll still have to maintain those applications and the underlying operating systems, while keeping up with patches.
“It’s a mistake to take a model designed for a private data center and move it to the cloud,” Sheth said. “Moving to the cloud gives you the opportunity to do things differently and gain massive efficiencies.”
He pointed to Gmail, which was designed very differently from traditional email. “We designed it to run over thousands of systems and maintain the mail of tens of millions of users. We designed it with a multi-tenant architecture. As a result, we can do a variety of things to make it more efficient.”
Sheth claims that for the enterprise, Google can bring down the typical cost of email from $250-$300/user/year to $50/user/year.
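Taken at face value, Sheth’s numbers imply dramatic per-seat savings. A quick back-of-the-envelope check (using the midpoint of the quoted $250–$300 range; the 1,000-user head count is an illustrative assumption, not a figure from the article):

```python
# Rough check of the quoted email cost claim. All figures are the
# article's quoted estimates or illustrative assumptions, not real pricing.
traditional_cost = 275.0   # midpoint of the quoted $250-$300/user/year
cloud_cost = 50.0          # the quoted $50/user/year cloud figure
users = 1000               # hypothetical head count for illustration

savings_per_user = traditional_cost - cloud_cost
savings_pct = savings_per_user / traditional_cost * 100
annual_savings = savings_per_user * users

print(f"Savings: ${savings_per_user:.0f}/user/year ({savings_pct:.0f}%)")
print(f"For {users} users: ${annual_savings:,.0f}/year")
```

Even at the low end of the quoted range, the claim amounts to cutting email costs by roughly three quarters or more.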
What Do the Traditional Data Center Vendors Think?
Many traditional data center vendors have a cloud strategy. But whether it’s just a marketing strategy or a real technical one is a different matter.
When I spoke to Brocade CTO Dave Stevens, I expected to hear a lot of warnings about the risks of cloud computing. Stevens surprised me, saying that the cloud-computing model as it applies to the enterprise is truly revolutionary.
“In terms of infrastructure, however, it’s evolutionary,” Stevens pointed out. “You don’t need a new switch or a new disk drive to do any of this.
“As a technology, cloud computing has been around for quite some time, as has virtualization. You’ve been able to virtualize network infrastructure for ten years. It’s not hard to take a router and cut it up into multiple virtual routers. You can do the same thing with switches and firewalls. There’s an awful lot of virtualization technology that is standard today.”
The trick, though, is moving up the stack from the plumbing. Ideally, an IT organization would have an easy, intuitive software layer that allows it to reach into pools of infrastructure components and combine them, through software, into an IT-deployable system.
That’s easier said than done. Then when users access those resources, you have to bill them or their business unit for it. Again, not a simple proposition.
For public clouds, another huge obstacle is data movement. It’s one thing to move a few megabytes of data; it’s quite another to move a terabyte. Moving the application is the easy part. Moving huge chunks of data is much harder.
Stevens also brought up compliance. “If I’m moving data with medical information across a public WAN, how do I maintain HIPAA compliance? From an auditing perspective, I’m responsible for protecting that data. The cloud vendor isn’t.”
Google believes that the security, compliance and reliability issues are much less worrisome than everyone is led to believe.
“We’ve been proving that we’re reliable at keeping services up, and if a service goes down, we’re transparent with users about what happened. We already offer 99.9% SLAs,” Sheth said. “Compare that to what enterprises are used to on their internal systems and we’re more reliable.”
Sheth noted that there are still problems to solve, but many of these are more behavioral than technical.
“In many ways the security model of the cloud is better than with internal data centers,” he said. “In the client-server model, a lot of sensitive information resides on the client. Patching is a nightmare, as well.
“In the cloud, it’s easier to centralize and protect data, and you have the ability to quickly address security vulnerabilities.”
Proving that protection to an auditor, though, is a different matter.
Turning Back to Today
How long will it take until vendors actually solve these compliance, latency, data migration and security issues?
“Until [software vendors] see demand from their customers to move major business-critical applications into the cloud, there’s no impetus for them to try to solve these problems,” Simpson said.
“It’s a question of timing. Do you want to spend a lot of time, energy and money today on a market that may emerge three years from now, especially when your competitors are waiting until 2012 to act?”
Everyone I talked to agreed that the biggest short-term benefits come with virtualization – and virtualization has the added benefit of paving the road for future innovations and efficiencies.
“There are a lot of good options, today, to drive efficiencies: virtualization, high-density blade systems, high-performance switching platforms and high-density storage infrastructure,” Stevens said.
“If you take what you deployed seven or eight years ago and update it with available technologies, you can lower your data infrastructure costs by 30-50%.”