Everybody is talking about “cloud computing,” the latest and most problematic major buzzword to plague IT jargon. Here’s why I think we should all just stop using it.
“Cloud computing” is confusing. Just about everyone in technology seems to like throwing around “cloud computing,” but either doesn’t know, or doesn’t agree on, what it means.
“Cloud computing” has been used to mean grid computing, utility computing, software as a service, Internet-based applications, autonomic computing, peer-to-peer computing and remote processing. When most people use the term, they may have one of these ideas in mind, but the listener might be thinking about something else.
“Cloud computing” is misleading. As a marketing buzzword, it’s used to suggest that something new and better is going on, when in fact there may be nothing new about it. Yes, new technologies arise all the time. But assimilating new technologies into the “cloud computing” label doesn’t make all the old stuff under that umbrella new. It also “clouds” communication between technical people and non-technical people. The former usually understand that “cloud computing” is a largely meaningless catch-all phrase to describe just about everything that’s happening online nowadays. But the latter, I’ve found, tend to assume it’s something more specific than that.
“Cloud computing” is redundant. Unlike other buzzwords such as, say, “Web 2.0” or “virtualization,” the phrase wasn’t coined out of necessity to describe something new, or to bring clarity to something vague. Just the opposite. It was coined to put a new coat of paint on something old, and to add vagueness to specific, well-understood technologies.
“Cloud computing” is dangerous. It’s useful only as a buzzword, as marketing pixie dust that vendors sprinkle on whatever they’re selling in order to override the application of good sense.
The phrase “cloud computing” originates in a common symbol — a cartoonish cloud outline — used in network diagrams to represent processes that are either too complex to describe, or systems managed by others. It represents a “black box” in which things happen beyond our understanding or control.
In other words, the “cloud” in “cloud computing” represents ignorance. And this ignorance is touted as one of the benefits of “cloud computing.” When companies hawk “cloud computing,” they’re selling the idea that ignorance is bliss. Don’t worry your pretty little head about it. We’ll take care of everything.
As companies get excited about, grow comfortable with and ultimately embrace what they think is the shiny new world of “cloud computing,” everything becomes less reliable. A single Web page, for example, might be built out of several “cloud” components — one company providing storage, another applications and still another site metering. If anything breaks, everything breaks. Cloud computing simply increases the number of things that can go wrong.
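To put rough numbers on that, here’s a back-of-the-envelope sketch (my own illustration, using hypothetical uptime figures, not data from any of these vendors): if a page depends on three independent services that each deliver 99.9% uptime, the page as a whole can expect only about 99.7%, which works out to roughly 26 hours of downtime a year.

```python
# Back-of-the-envelope reliability math (hypothetical numbers, for illustration only):
# a page that needs storage, an application host, and a metering service
# is only as available as the product of its dependencies' availabilities.

availabilities = {
    "storage": 0.999,        # assumed 99.9% uptime
    "applications": 0.999,   # assumed 99.9% uptime
    "site metering": 0.999,  # assumed 99.9% uptime
}

combined = 1.0
for service, uptime in availabilities.items():
    combined *= uptime  # every service must be up for the page to work

print(f"Combined availability: {combined:.4%}")        # -> 99.7003%
hours_down_per_year = (1 - combined) * 365 * 24
print(f"Expected downtime per year: {hours_down_per_year:.0f} hours")  # -> ~26 hours
```

Every dependency you add compounds the loss, which is exactly the sense in which stacking “cloud” components increases the number of things that can go wrong.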
And go wrong they do. In the past few weeks, GoToMeeting, Amazon’s EC2 and S3, SiteMeter, Gmail, Netflix and MobileMe each experienced significant outages.
If you think outages of Internet-based services are increasing in both frequency and duration, you’re not crazy. The ever-increasing complexity of these multiple black-box components increases the chances that something will go south.
I recently tried to move everything from my PCs to the “cloud” (OK, the Internet): Gmail online, Google Docs for recent stories and other content, Plaxo, Jott, Google Calendar and a whole bunch of other services I use as replacements for desktop versions. On a recent business trip, I found myself on a flight with literally zero access to any current work, and nothing to do but look out the window — at real clouds. This is the worst-case scenario for “cloud computing”: zero access, and nothing you can do about it.
(Of course, that lack of access was my own fault for taking the online app thing too far, and for not keeping redundant copies of everything on my laptop. My main beef is with the phrase “cloud computing,” not with my own misapplication of it.)
There are technologists and analysts who are earnestly toiling away to give “cloud computing” meaning, and to make it a usable concept. I admire their spunk. And I’m sure I’ll get hate mail from some of them telling me that, no, “cloud computing” is a meaningful and useful term. It doesn’t matter if these “cloud computing” experts disagree with me. What matters is that they in fact disagree with each other. And that’s part of the problem.
The “cloud computing” buzzword has got to go. It’s simply too confusing, misleading, redundant and dangerous.