We’ve had a rather inconsistent view of exactly where computing power should be kept, and that view just began another huge shift with the advent of 5G wireless technology. We started out with mainframes, where computing power was centralized, then we moved to PCs, where it was highly decentralized. Then we moved to client/server computing, which was, admittedly, kind of a complex mess, and then we moved to cloud computing, where computing power is centralized on a massive scale.
Well, I’ve now seen presentations from a variety of vendors talking about 5G, and one of the most consistent messages is that these largely brand-new, multi-billion-dollar mega-datacenters are soon to be obsolete. This is because 5G will demand that data again be decentralized. (Coupled with the concept of edge computing, this isn’t that far removed from the old PC-centric model.)
Confused? You betcha, but let’s see if we can dig out of this mess.
The issue with 5G is that it promises impressively low latency and impressively high data rates. Rollout is aggressive, but the existing network that 5G devices will be connected to was not designed for the massive increase in bidirectional data traffic, nor to deliver on the low-latency promise that 5G makes.
As a result, data centers will again have to be decentralized so that performance isn’t bottlenecked and the promised improvements can actually be realized.
What I find interesting is that this move is coming from companies like HP, IBM, and AT&T, not companies like Amazon, Google, or Facebook, making me wonder if someone missed a meeting. This suggests a coming issue for some of the mega-cloud providers and a potential benefit for some of the challengers, like IBM and Dell’s Virtustream. These challengers could, because they are smaller, move faster on this change and then provide far better service to 5G customers than the industry’s giants.
Edge computing adds another dynamic because, as presented by Qualcomm (and others), it suggests that some of the processing will be done locally to reduce the traffic flowing over the network in the first place. This offsets the problem in a different way.
For instance, if a security camera can do some compression, pre-processing, or initial detection, that metadata could be sent along with the image, lowering the processing requirements on the central resource. This could improve reaction times without significantly increasing the amount of data sent.
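To make that concrete, here is a minimal sketch of what such a camera-side pipeline might look like. The detect_objects(), compress(), and send_to_cloud() calls are hypothetical stand-ins for an on-device model, a codec, and the 5G uplink, not a real API.

```python
# Minimal sketch of edge pre-processing on a camera.
# detect_objects(), compress(), and send_to_cloud() are assumed, illustrative functions.

import time

def process_frame(frame):
    detections = detect_objects(frame)        # run a small model locally, on the camera
    if not detections:
        return                                # nothing of interest: send nothing at all
    metadata = {
        "timestamp": time.time(),
        "objects": [d.label for d in detections],
        "boxes": [d.box for d in detections],
    }
    payload = compress(frame, quality="low")  # heavy compression; the metadata carries the meaning
    send_to_cloud(metadata, payload)          # the central resource can react to metadata first
```

The point is simply that the central service receives a few kilobytes of structured metadata plus a heavily compressed frame, rather than a raw video stream.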
NVIDIA’s latest graphics technology could provide another interesting opportunity that I haven’t yet seen addressed. Its new Turing architecture gets to its impressive performance level by splitting rendering into two phases: first an initial ray-tracing pass, and then an upscaling pass to reach impressive real-time 4K resolutions within a far lower power envelope.
But what if you sent only the low-resolution initial frame, along with the metadata needed to upscale it, and did that up-conversion at the point where the image is actually consumed? You’d potentially cut the amount of data being sent massively while maintaining a low-latency 4K experience.
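A back-of-the-envelope calculation shows why this matters. The bitrates below are rough, assumed figures for compressed video streams, not measurements from any specific service or codec.

```python
# Illustrative bandwidth arithmetic for "send low-res plus metadata, upscale at the endpoint".
# All bitrates are assumptions for the sake of the example.

FOUR_K_MBPS = 35     # assumed bitrate for a compressed 4K/60 stream
P1080_MBPS = 10      # assumed bitrate for a compressed 1080p/60 stream
METADATA_MBPS = 1    # assumed overhead for per-frame upscaling metadata

sent = P1080_MBPS + METADATA_MBPS
savings = 1 - sent / FOUR_K_MBPS
print(f"Data sent: {sent} Mbps instead of {FOUR_K_MBPS} Mbps "
      f"({savings:.0%} less), with the 4K frame reconstructed at the endpoint")
```

Under those assumptions, the network carries roughly a third of the traffic while the viewer still sees a 4K image.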
Combining these two concepts, you should be able to massively improve the experience of a 5G device user while largely leaving the existing network in place. And if you dynamically managed the load between centralized data centers and the new distributed data centers you will still need to build, you should be able to make the best use of existing resources and grow your distributed capacity in line with 5G deployments.
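In code terms, that dynamic management boils down to a placement decision per request. Here is a minimal sketch, assuming an illustrative Site record and arbitrary latency and utilization thresholds; none of this reflects any vendor’s actual scheduler.

```python
# Sketch of latency-aware placement between edge sites and a central data center.
# The Site class, thresholds, and numbers are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Site:
    name: str
    latency_ms: float   # measured round-trip time from the user
    load: float         # current utilization, 0.0 - 1.0

def place(latency_budget_ms, edge_sites, central):
    # Prefer the least-loaded edge site that can meet the latency budget.
    candidates = [s for s in edge_sites
                  if s.latency_ms <= latency_budget_ms and s.load < 0.8]
    if candidates:
        return min(candidates, key=lambda s: s.load)
    return central  # otherwise fall back to the existing centralized data center

edge = [Site("edge-a", 8, 0.65), Site("edge-b", 12, 0.40)]
cloud = Site("central", 45, 0.30)
print(place(20, edge, cloud).name)  # -> "edge-b"
```

The interesting management problem is keeping those latency and load measurements fresh enough that the decision stays correct as 5G traffic grows.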
This is really a shift to a much more complex model than we’ve ever had before, and management tools will need to evolve to deal with it. The combined result should create a kind of meshed intelligence that could be more broadly shared for big projects, like BOINC but massively more powerful and dynamic.
5G will result in a blended distributed computing model with increased complexity but far higher potential performance. This will require that we rethink how we build our networks, where we put our computing resources, and how we manage the result.
It will likely require a massive shift of existing resources, mitigated somewhat by improved dynamic load-balancing tools. Those cloud providers that see this coming stand to gain significant market share at the expense of those that don’t.
But we aren’t done yet. Quantum computing is coming, and it promises such a massive increase in performance, at such a massive cost at scale, that we will likely shift back to a highly centralized model again, at least for those needing this resource. That will also require a massive upgrade to existing networks.
I guess this is a long way of saying the tech market is about to become a lot more lucrative – but only for those that understand and can prepare for this change. Good luck!