Everyone, it seems, is heading for the cloud at breakneck speed. But in the headlong rush, questions are starting to crop up about how your business stays in control.
It may not be enough to take your protein pills and put your helmet on, as the lyrics of David Bowie’s “Space Oddity” suggest. So what can you do?
First, let’s understand what all the hurry is about. Why are firms adopting cloud computing?
Randy Mott, CIO of HP, has said that within three years, 55% of HP IT applications will be in the cloud. He laid out a multitude of problems with keeping everything in-house: multiple apps all doing the same thing, and too much time spent on software maintenance and not enough on innovation.
Others are looking to the cloud to reduce costs and increase flexibility. Jay Kerley, Deputy CIO of Applied Materials, for instance, has been overseeing a project to create a desktop cloud for the company’s many Computer Aided Design (CAD) users. His principal drivers were reducing the desktop budget considerably by eliminating the need for an expensive workstation under every desk, while at the same time providing the workforce with far more mobility options.
All the while, these execs are being egged on by a constant wave of hype surrounding cloud computing, as well as a barrage of marketing from vendors, analyst firms and thought leaders echoing the clarion call.
Michael Schrage, research fellow at MIT Sloan School’s Center for Digital Business, for example, went so far as to proclaim the cloud as “the greatest medium for rapid multi-modal experimentation and test in the history of the world.”
In his view, IT needs to adopt the cloud, or the business world will bypass IT and go direct to cloud providers for more and more projects. Already, he said, this trend is underway. The question: will IT adapt or will it be relegated to a mere plumbing role?
Your Circuit’s Dead, There’s Something Wrong
But hype is one thing and reality quite another. Schrage raised various questions about the cloud, such as: how do you benchmark against it? This is something IT had better figure out fast.
Others are looking at even more serious issues. Miki Sandorfi, chief strategist at Hitachi Data Systems (HDS), brings up several problems with the cloud, including outages, security and more.
“Some high availability numbers from cloud providers have been exposed as hype as the wrong data was restored after an outage,” he said.
That became all the more evident with the recent Amazon Elastic Compute Cloud (EC2) outage that put dozens of websites and social networks out of commission. Amazon’s promise of 99.95% availability came to nothing, with some sites being down for more than a day.
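To put that figure in perspective, a quick back-of-the-envelope calculation shows how little downtime a 99.95% promise actually allows. The sketch below is simple arithmetic, not anything Amazon publishes; the SLA percentage is the only number taken from the story.

```python
def allowed_downtime_hours(availability_pct: float, period_hours: float) -> float:
    """Downtime budget implied by an availability percentage over a given period."""
    return period_hours * (1 - availability_pct / 100.0)

HOURS_PER_YEAR = 365.25 * 24
HOURS_PER_MONTH = 30 * 24

# 99.95% works out to roughly 4.4 hours of downtime per year,
# or about 22 minutes per month; a multi-day outage blows through
# years of that budget.
print(allowed_downtime_hours(99.95, HOURS_PER_YEAR))   # ~4.38
print(allowed_downtime_hours(99.95, HOURS_PER_MONTH))  # ~0.36
```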
So it’s no surprise that we are now seeing some storage architectures that tout more control, safeguards and security for those journeying to the cloud. HDS, for instance, has developed an approach to the cloud that it characterizes as an integrated edge-to-core solution.
The idea is to be able to move into or out of the cloud while minimizing risk. It consists of the Hitachi Data Ingestor (HDI) at the edge, and the Hitachi Content Platform (HCP) at the core.
HDI operates as the on-ramp, said Sandorfi, while HCP’s 40 PB of capacity includes data protection, replication, deduplication, compression, encryption and retention. When deduplication and compression are factored in, that’s an awful lot of room for expansion.
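Sandorfi did not spell out specific reduction ratios, and they vary widely by data set, but the arithmetic behind that room for expansion is straightforward: raw capacity multiplied by whatever deduplication and compression ratios the data yields. The ratios in the sketch below are purely illustrative.

```python
def effective_capacity_pb(raw_pb: float, dedupe_ratio: float, compression_ratio: float) -> float:
    """Logical capacity implied by raw capacity plus data-reduction ratios (expressed as N:1)."""
    return raw_pb * dedupe_ratio * compression_ratio

# Hypothetical 4:1 deduplication and 2:1 compression on 40 PB raw.
print(effective_capacity_pb(40, 4.0, 2.0))  # 320 PB of logical capacity
```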
“Whether an organization is looking to design a private cloud environment to better serve its internal business users or a service provider looking to build an infrastructure to sell their own cloud services,” Sandorfi said, “they need the ability to segregate data securely and manage varying data sets and workloads according to their individual requirements, all while capturing and measuring consumption and feature use at a granular level for reporting and/or chargeback.”
He added that the HDS cloud products work with any disk arrays, including those from rivals such as NetApp and EMC.
Planet Earth is Blue and There’s Nothing I Can Do
One big problem with the cloud is that once you dump a ton of data into it, you can be in trouble if you need to get that data back in a hurry.
That is one of the issues that concerned Jeff Rountree, Global Network Manager for Pump Solutions Group (PSG). He has implemented the cloud in his company using AT&T as the cloud service provider, backed up by a Whitewater Cloud Storage Accelerator by Riverbed.
With a company network that spans the USA, China, France, Germany and India, Rountree explained that there was a need for more and more storage. Before data is sent up to the cloud, it is encrypted within the Riverbed appliance.
“Whitewater Accelerators optimize and deduplicate data so that keeps my costs down in a pay-as-you-go cloud model,” said Rountree. “I end up paying for 10 GB instead of 100 GB.”
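The cost side of that claim is simple arithmetic in a pay-as-you-go model: the bill tracks what is actually stored after reduction, not the logical data backed up. The per-GB price below is made up, since Rountree did not disclose his rates; only the 100 GB-to-10 GB figure comes from him.

```python
def monthly_cloud_cost(logical_gb: float, reduction_ratio: float, price_per_gb: float) -> float:
    """Monthly bill when only reduced (deduplicated) data is actually stored."""
    stored_gb = logical_gb / reduction_ratio
    return stored_gb * price_per_gb

# Hypothetical $0.10 per GB-month; a 10:1 reduction cuts the bill tenfold.
print(monthly_cloud_cost(100, 10.0, 0.10))  # 1.0 (dollars) instead of 10.0
```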
Client devices send data to a backup server, which forwards it to the Accelerator and from there to the cloud. This has resulted in backup times being cut in half and eliminated the need to stage backups on disk before sending them to tape. A copy of all data is retained onsite within the Riverbed appliance. As an additional safeguard, the cloud provider supplies both local and remote replication.
“The basics still matter though, such as making sure your backups are correct,” said Rountree. “If you put a bad backup up in the cloud, you end up with multiple copies of bad data.”
Cloud storage management is provided by the Whitewater Accelerator. Rountree said he plugged it in, gave it an IP address and then mapped the storage devices to the backup target. Sending tapes offsite in trucks to an outside archive provider is no longer required.
“The benefits have been no more tape restores, a lot more flexibility in disaster recovery and about two hours a day less of administrative overhead,” said Rountree.
Commencing Countdown, Engines On
Deduplicating data before sending it to the cloud keeps the volume down. But what about speeding up the transfer rate so you can get data in and out quickly?
That challenge is addressed by StorSimple. Its StorSimple 5000 hybrid cloud appliance utilizes solid state disk (SSD) to add speed. It also comes with built-in deduplication to reduce disk consumption and an interface to manage the connection to online storage providers such as Amazon, AT&T and Zetta.
This setup keeps vital data in the appliance while everything else is moved off to the cloud. The architecture delivers high throughput for data sent into and pulled out of the cloud.
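StorSimple has not published the internals of its tiering logic, so the sketch below is only a generic illustration of the pattern such hybrid appliances follow: blocks touched recently stay on local SSD, everything else becomes a candidate for the cloud tier. The 30-day threshold and the function name are hypothetical.

```python
import time

CLOUD_TIER_AGE_SECONDS = 30 * 24 * 3600  # hypothetical: demote data idle for 30 days

def tier_blocks(blocks, now=None):
    """Split (block_id, last_access_epoch) pairs into local (SSD) and cloud sets."""
    now = now if now is not None else time.time()
    local, cloud = [], []
    for block_id, last_access in blocks:
        if now - last_access <= CLOUD_TIER_AGE_SECONDS:
            local.append(block_id)   # hot data stays in the appliance
        else:
            cloud.append(block_id)   # cold data is migrated to the cloud provider
    return local, cloud

# Example: one block touched an hour ago, one idle for 90 days.
now = time.time()
print(tier_blocks([("blk-1", now - 3600), ("blk-2", now - 90 * 24 * 3600)], now))
# (['blk-1'], ['blk-2'])
```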
“The StorSimple 5000 eliminates the nightmare of having to archive aging data to tape,” said Ian Howells, chief marketing officer of StorSimple. “90 percent of active data can be read from within an SSD.”