Datamation content and product recommendations are editorially independent. We may make money when you click on links to our partners.
On-demand computing has moved into the spotlight in recent months, due in no small part to vendors such as IBM and Computer Associates showering it in hype.
The complete vision of on-demand computing, proponents say, is a scenario in which computing power is available as easily as electrical power. That sounds great in theory, and it may well be once it becomes reality. Examined closely, however, there are significant issues today that will likely delay widespread deployment, and they should not be underestimated.
- The technology required to support an on-demand approach isn’t yet available and likely will not be in the short term.
- Even when it is ready, IT managers and their bosses may be reluctant to trust mission-critical applications to servers that are offsite and ultimately controlled by another company.
- IT managers must be careful to employ whatever version of on-demand computing emerges in a manner that actually reduces enterprise costs. This seems simple but, according to one analyst, will require discipline.
This isn’t to say that IT folks don’t like the idea of on-demand computing. “I’m enthusiastic about it,” says Dana Blankenhorn, a senior business analyst for Progressive Strategies. “This is where we’re going. The question is how we get there, and the bumps along the way.”
The first issue is the general approach. While raw bandwidth itself is inexpensive, the interfaces, redundancy, and general “bullet-proof” requirements necessary to perform the computing tasks in a centralized — perhaps distant — place may be more expensive than performing the computing tasks onsite.
The next consideration is the hearts and minds of senior management. Even assuming that an on-demand architecture is financially plausible, many in senior management may not think it is safe, says Paul Antturi, the manager of information systems for McKesson Medical Imaging Group, a company located in Vancouver, Canada. “I don’t see it as a technical issue,” he says. “But it would be a significant hindrance for corporate leaders to trust information to a third party.”
The third — and perhaps most formidable — hurdle involves the underlying software. The term “on-demand” suggests that compute cycles are delegated to applications on an as-needed basis. However, software currently doesn’t load balance between applications on the fly, Antturi says. An e-mail server runs e-mail. During lulls, like at 4 a.m., the e-mail server sits idle or operates at reduced capacity. Its spare cycles can’t be redirected to an inventory application running at full tilt.
“Unfortunately, today’s software design does not allow for load balancing of all applications across all available servers. To process e-mail I must use the e-mail server. I cannot send small process segments to any available server for processing,” says Antturi. “Current software methodology cannot and will not lead to utility computing … Hardware is capable of utility computing today, just as it’s capable of grid computing. Software design is the limitation.”
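Antturi's point can be sketched as a toy model. The server names and workloads below are hypothetical, used only to contrast today's dedicated-server dispatch with the pooled dispatch a true utility model would require:

```python
# Toy illustration of the limitation Antturi describes (hypothetical
# names; not any real product's API).

# Today's model: each application is bound to one dedicated server.
DEDICATED = {"email": "mail-server", "inventory": "inv-server"}

def dispatch_dedicated(app):
    """Work for an app can only run on its own server, busy or not."""
    return DEDICATED[app]

def dispatch_utility(app, free_servers):
    """The utility model: any idle server accepts any app's work unit."""
    return free_servers.pop(0) if free_servers else None

# At 4 a.m. the mail server is idle while inventory runs flat out.
# Under the dedicated model those spare cycles are simply wasted;
# under the utility model the inventory work lands on the idle box.
assert dispatch_dedicated("inventory") == "inv-server"
assert dispatch_utility("inventory", ["mail-server"]) == "mail-server"
```

The gap between the two `dispatch` functions is precisely what Antturi argues current software design cannot bridge: applications are not written to break work into small units that any available server can process.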
Thus, “on-demand” computing may consist of traditional arrangements in which single-purpose servers sit in a data center, each handling one application. When that application is not running, or is running at less than capacity, all or part of the server sits idle. The only things truly different are the ownership of the hardware and the payment method. The vendor may send over extra servers on an on-demand basis — the enterprise calls and more machines are dispatched. In this way, an “on-demand” relationship between vendor and customer exists. But this clearly isn’t the futuristic system vendors are portraying.
The ultimate, and much more mundane and practical, drawback to on-demand computing is human nature.
“What I see as the biggest potential downside or challenge is the law of unintended consequences,” says Lance Travis, the vice president of research for AMR Research. “If you begin to treat this unlimited capacity as a free resource you could be doing the wrong thing.”
Off-loading to a third party may encourage the business to waste resources. Travis uses e-mail as an example. Enterprises running their own e-mail infrastructures generally make employees get rid of unnecessary messages on a regular basis to keep storage requirements down. If e-mail storage is available from a third party, however, this “nagging” from the IS organization may subside. The amount of storage bought on demand could, in this scenario, add up to staggering proportions.
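A back-of-envelope calculation shows how quickly unchecked growth compounds. All the figures below are assumptions for illustration, not numbers from the article:

```python
# Hypothetical figures: headcount, mail growth rate, and the
# per-GB on-demand storage price are all assumed, not sourced.
employees = 5_000
mb_per_user_per_month = 50      # growth if nobody ever deletes mail
price_per_gb_month = 0.50       # assumed on-demand storage rate
months = 36                     # three years of unchecked growth

total_gb = employees * mb_per_user_per_month * months / 1024
monthly_bill = total_gb * price_per_gb_month
print(f"{total_gb:.0f} GB accumulated, ${monthly_bill:,.0f}/month for mail storage alone")
# → 8789 GB accumulated, $4,395/month for mail storage alone
```

The absolute numbers matter less than the shape of the curve: with no deletion policy, the bill grows linearly forever, which is Travis's "unintended consequences" point in miniature.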
The key is to make sure the enterprise doesn’t act any differently once the new infrastructure is established. “What I counsel them to do is begin thinking of IT as an organization sitting between your users and the service providers,” Travis says. “You need to have policies and SLAs based on your strategic requirements as a company. You need to map them into services you buy from an on-demand supplier, as opposed to buying what they offer without a thought or a strategy.”