There's been at least as much healthy skepticism about cloud computing as there has been optimism and real results. And there ought to be, especially as cloud computing moves out of buzzword territory and becomes an increasingly powerful tool for extending IT resources.
To that end, here's a rundown of ten key things both creators and users of cloud computing should continue to bear in mind.
The good news is that the very nature of the cloud may be compelling more real thought about security on every level than before. The bad news is that a poorly written application can be just as insecure in the cloud, maybe even more so.
Cloud architectures don't automatically grant security compliance for the end-user data or apps on them, so apps written for the cloud always have to be secure on their own terms. Some of the responsibility for this does fall to cloud vendors, but the lion's share of it is still in the lap of the application designer.
A cloud computing-based solution shouldn't become just another passive utility like the phone system, where the owner simply puts a tollbooth on it and charges more and more while providing less and less. In short, don't give competitors a chance to do an end run around you because you've locked yourself into what seems like the best way to use the cloud and given yourself no good exit strategy. Cloud computing is constantly evolving. Getting your solution in place simply means your process of monitoring and improving can now begin.
We're probably past the days when people thought clouds were just big server clusters, but that doesn't mean we're free of ignorance about the cloud moving forward. There are all too many misunderstandings about how public and private clouds (or conventional datacenters and cloud infrastructures) do and don't work together, about how easy it is to move from one kind of infrastructure to another, about how virtualization and cloud computing do and don't overlap, and so on.
A good way to combat this is to present customers with real-world examples of what's possible and why, so they can base their understanding on actual work that's been done and not just hypotheticals where they're left to fill in the blanks themselves.
Cloud infrastructures, like a lot of other IT innovations, don't always happen as top-down decrees. They may happen from the bottom up, in a back room somewhere, or on an employee's own time from his or her own PC.
Examples of this abound: consider a New York Times staffer's experience with desktop cloud computing. Make a sandbox space within your organization for precisely this kind of experimentation, albeit with proper standards of conduct (e.g., as a safety measure, not using live data that might be proprietary). You never know how it'll pay off.
The biggest example of this: Amazon EC2. As convenient as it is to develop for the cloud using EC2 as one of the most common types of deployments, it's also something to be cautious of. Ad-hoc standards are a two-edged sword.
On the plus side, they bootstrap adoption: look how quickly a whole culture of cloud computing has sprung up around EC2. On the minus side, they leave that much less room for innovators to create something open that can break away from the ad-hoc standards and be adopted on its own terms. (Will the Kindle still be around in ten years?) Always be mindful of how the standards you're using now can be expanded or abandoned.