Monday, April 19, 2021

IBM And The Promise Of Quantum Hybrid Deep Learning AI

With IBM leading in Deep Learning AI technology at scale – and among the most visible companies in Quantum computing research – many of us wondered when we’d see a presentation combining the two technologies. Well, this week the wondering ended as IBM briefed us on what appears to be the beginning of a new hybrid computer: one that combines the power of Watson with the power of IBM’s Quantum effort to create something very new and different.

What makes this particularly interesting is that the two technologies are very different. Watson largely sprang out of a Neural Networking effort, partially focused on emulating the human brain, and initially won game shows as a showcase. So, at its heart, Watson is an electronic computer emulating an organic computer. Quantum technology is vastly different: it didn’t even come from the technology market but from Physics Theory. Not only does it have little in common with existing computers, it has almost nothing in common with organic computers – meaning that combining the two technologies is monumentally difficult.

But IBM has apparently figured out a path to this future. Let’s talk this week a bit about what that means. 

The Quantum Computer Superpower

We had a lot of issues when we moved from single-core processors to multi-core solutions. Most programs were written to execute sequentially, which meant that on a typical multi-core computer they pegged one core and left the rest idling unused. It took us a while to figure out how to write and rewrite code so it could execute in parallel; once we did, performance jumped dramatically.
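To make the sequential-versus-parallel distinction concrete, here is a minimal sketch in Python (not tied to any specific product mentioned here): the same CPU-bound jobs run once on a single core, then spread across all available cores with the standard-library `concurrent.futures` module.

```python
import time
from concurrent.futures import ProcessPoolExecutor


def work(n):
    """A CPU-bound task: sum of squares below n."""
    return sum(i * i for i in range(n))


if __name__ == "__main__":
    jobs = [500_000] * 8

    # Sequential: one core is pegged while the rest idle.
    t0 = time.perf_counter()
    sequential = [work(n) for n in jobs]
    t_seq = time.perf_counter() - t0

    # Parallel: the same jobs spread across all available cores.
    t0 = time.perf_counter()
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(work, jobs))
    t_par = time.perf_counter() - t0

    assert sequential == parallel
    print(f"sequential: {t_seq:.2f}s  parallel: {t_par:.2f}s")
```

The results are identical either way; only the wall-clock time changes – which is exactly the rewrite-for-parallelism work the industry had to do once multi-core chips arrived.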

The change to Quantum computing makes that arduous process look exceedingly simple in comparison, because Quantum computers deal with data far differently. Their computing elements don’t even have the same states, which allows for more flexibility but creates a huge problem for writing optimized code, because few coders understand the difference.
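To illustrate the "not even the same states" point: a classical bit is exactly 0 or 1, while a qubit holds two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. A minimal sketch in plain Python (no quantum SDK assumed):

```python
import math

# A classical bit holds exactly one of two states.
bit = 0

# A qubit holds two complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measurement collapses it to 0 or 1
# with those probabilities.
alpha = complex(1 / math.sqrt(2), 0)
beta = complex(1 / math.sqrt(2), 0)

p0 = abs(alpha) ** 2  # probability of measuring 0
p1 = abs(beta) ** 2   # probability of measuring 1

assert math.isclose(p0 + p1, 1.0)
print(p0, p1)  # an equal superposition: each probability is ~0.5
```

A program written against bits has no notion of amplitudes or superposition, which is one reason code can’t simply be "ported" to a quantum machine.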

Even taking a simple program and converting it would be problematic. Current thinking is that you’d generally have to start from scratch with one of the handful of people who can write code for this platform, creating an application vastly different from anything seen before.

What motivates you to do this is that the processing potential of a Quantum computer is massively higher than anything we have yet experienced, and the potential performance growth rates make Moore’s Law look frozen in place.

This means that taking a platform like Watson and converting it to run optimally on a Quantum computer will likely be beyond our skill set for the foreseeable future. You could create a hybrid, though, and have the Quantum computer handle a task the AI doesn’t do well so the AI can perform more quickly.

Think of it like a turbocharger for a car. A turbocharger is a compressor, closer in design to a jet turbine than to a piston engine, and turbine-powered cars never worked out. But tied into a piston engine, a turbocharger compresses the intake charge and makes the car far faster. Together they are better than they are separately, and that is similar to what I’m talking about here. If a Quantum computer can turbocharge Watson, the result should be a significant performance boost.

The Quantum AI Turbocharger

One of the things IBM has discovered is that Quantum computers are very good at structuring unstructured data like images. They can, with an incredibly high degree of accuracy, sort highly complex objects.

Watson needs to make decisions from unstructured data, and traditional CPUs typically aren’t great, from a performance standpoint, at structuring it – which is why we use GPUs instead for high-performance efforts. Watson uses a lot of GPUs in its most advanced form, but Quantum computers are potentially far faster than a GPU. Combining the two systems should therefore result in vastly faster unstructured data analysis.

This won’t make GPUs obsolete, because they will still be needed in the decision-making process. But they will be fed data at far higher speeds than they could previously accept and, much like compressing the charge in that high-performance engine, the outcome should be a vastly more capable system.

There apparently needs to be an intermediate computer that bridges what the Quantum computer structures and what the AI accepts. IBM describes this intermediate layer as a NISQ (noisy intermediate-scale quantum) computer, and it acts as something like a translation bridge between the Quantum computer and the AI. For unstructured data, the result should be massively beyond what a classical computer can accomplish in both data complexity and performance.

Quantum Turbocharging?

AI is by nature performance limited, particularly with unstructured data. What IBM proposes is a kind of Quantum Turbocharger for unstructured data, which could be applied not only to AIs but to any computing solution that uses unstructured data, via a two-step approach.

That approach starts with the Quantum computer structuring the data so it can be consumed more quickly; a NISQ computer then further organizes the data so it can be consumed at the primary computer’s full speed; and finally the primary computer – which may or may not be an AI – does the rest.
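The staged pipeline above can be sketched in plain Python. This is purely illustrative: none of these function names are real IBM APIs, and each stage is a classical stand-in (simple grouping and sorting) for the role the article attributes to the quantum stage, the NISQ bridge, and the primary computer.

```python
def quantum_structure(raw_items):
    """Stand-in for the quantum stage: impose structure on raw,
    unstructured data (here, simply grouping items by a feature)."""
    structured = {}
    for item in raw_items:
        structured.setdefault(item["kind"], []).append(item["value"])
    return structured


def nisq_bridge(structured):
    """Stand-in for the NISQ bridge: reshape the structured output
    into the ordered batches the primary computer consumes."""
    return [(kind, sorted(values)) for kind, values in sorted(structured.items())]


def primary_consumer(batches):
    """Stand-in for the primary computer (an AI or otherwise):
    make a decision per batch (here, compute an average)."""
    return {kind: sum(values) / len(values) for kind, values in batches}


raw = [
    {"kind": "cat", "value": 3},
    {"kind": "dog", "value": 5},
    {"kind": "cat", "value": 1},
]
result = primary_consumer(nisq_bridge(quantum_structure(raw)))
print(result)  # {'cat': 2.0, 'dog': 5.0}
```

The point of the sketch is the shape of the data flow, not the math: each stage only has to hand the next stage data in a form it can consume at full speed.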

The result should be, according to IBM, exponentially faster than anything we have on the market today and a true, revolutionary game changer. If you think things are moving fast now, just wait until this puppy ships in the mid-to-late 2020s.
