Sunday, June 16, 2024

IBM Goes All In On Fight Against Covid-19: The Tech Industry Goes To War


IBM had earlier announced it was pivoting Summit, the most powerful supercomputer in the world, to fight Covid-19. Working with Oak Ridge National Laboratory and the University of Tennessee, the company has since been able to screen 8,000 compounds to find out which were most likely to mitigate Covid-19.

They were looking for compounds that could best bind to the coronavirus's central "spike" protein, rendering it unable to infect host cells. In effect, they were looking for something that would make the virus ineffective against anyone medicated with one of these compounds. The effort surfaced 77 promising small-molecule drug candidates that could potentially make Covid-19 wholly or partially ineffective. These compounds are now undergoing testing.
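The screening workflow described above, score every candidate compound against a model of the spike protein and keep only the most promising binders for laboratory testing, can be sketched in miniature. This is a hedged illustration, not IBM's actual pipeline: the compound names, scores, and the `rank_candidates` helper are all assumptions made for the example.

```python
# Illustrative triage sketch: rank candidate compounds by predicted
# binding energy to the spike protein and keep the strongest binders.
# Names and scores are invented for the example, not from the Summit runs.

def rank_candidates(docking_scores, keep=3):
    """Return the `keep` compounds with the strongest (most negative)
    predicted binding energy, best first."""
    ranked = sorted(docking_scores.items(), key=lambda kv: kv[1])
    return [name for name, _ in ranked[:keep]]

if __name__ == "__main__":
    scores = {  # kcal/mol; more negative = tighter predicted binding
        "compound_a": -7.2,
        "compound_b": -5.1,
        "compound_c": -9.8,
        "compound_d": -6.4,
    }
    print(rank_candidates(scores))  # strongest predicted binders first
```

In the real effort, the expensive part is computing each score (molecular docking and simulation on Summit); the final ranking-and-shortlisting step is comparatively trivial, which is why 8,000 screened compounds can be narrowed to 77 candidates for physical testing.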

This week IBM announced it was adding another 16 systems (330 petaflops, 775,000 CPU cores, and 34,000 GPUs) in collaboration with the White House Office of Science and Technology Policy and the US Department of Energy through a massive technology pool. This pool includes Lawrence Livermore National Laboratory (LLNL), Argonne National Laboratory (ANL), Oak Ridge National Laboratory (ORNL), Sandia National Laboratories (SNL), Los Alamos National Laboratory (LANL), the National Science Foundation (NSF), NASA, the Massachusetts Institute of Technology (MIT), Rensselaer Polytechnic Institute (RPI), and multiple leading technology companies.

This move is the kind of effort you typically see only during a massive-scale war, and it is certainly indicative of how seriously some of the major technology players are taking this crisis.

This is the first time this level of computational power has been applied to a single problem.

The Application Of Overwhelming Computational Power

Now, a few years back, even if we had had this kind of computational power available, it would have been challenging to apply it rapidly to something it wasn't designed to do.

These systems were generally designed for massive problems such as predicting global weather patterns, modeling the interactions between galaxies, and attempting to discover dark matter. You typically can't take something designed for one class of problem and pivot it quickly to another.

However, over the last decade, IBM has largely re-engineered how these huge systems work, driven by the need for them to handle a wide variety of problems so that their massive cost can be spread across more organizations.

Even the way we connect systems like this into a solution has undergone a rather revolutionary change. It used to be that when various systems worked with unstructured data that needed to be collated to find an answer, you would have to re-analyze the data from scratch with whatever computational tool you had.

But now the process is to let each system do the analysis it was designed for and then have a master system analyze the combined results, which cuts a massive amount of time out of finding an answer and reduces the computational power needed to arrive at a solution.
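The two-stage pattern described above, local analysis on each system followed by a master step that merges only the summarized results, can be sketched as follows. This is a minimal illustration of the general idea, not IBM's implementation; the field names, threshold, and helper functions are all assumptions for the example.

```python
# Hedged sketch of the workflow described above: each participating
# system reduces its own raw data to a small summary, and a master
# step then combines the summaries instead of re-analyzing all of
# the raw data from scratch. All names are illustrative.

def local_analysis(records):
    """Per-system step: reduce raw records to a compact summary."""
    hits = [r for r in records if r["score"] >= 0.8]
    return {"count": len(records), "hits": len(hits)}

def master_merge(summaries):
    """Master step: combine the small per-system summaries."""
    return {
        "count": sum(s["count"] for s in summaries),
        "hits": sum(s["hits"] for s in summaries),
    }

if __name__ == "__main__":
    site_a = [{"score": 0.9}, {"score": 0.4}]
    site_b = [{"score": 0.85}, {"score": 0.95}, {"score": 0.2}]
    summaries = [local_analysis(s) for s in (site_a, site_b)]
    print(master_merge(summaries))  # {'count': 5, 'hits': 3}
```

The savings come from data movement: only the tiny summaries cross between systems, while the bulk of the raw data stays where it was analyzed.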

Now, if you combine the massive additional computational power with this change in process, you get a multiplicative impact: the combination should produce far more reliable results considerably faster. One of the things I expect they'll be looking for is easy-to-detect early markers of illness that can be screened at scale.

For instance, we know that one of the first symptoms is a loss of taste. Assuming everyone could be tested while well to set a baseline, you should be able to rapidly test a lot of people for a loss of taste if you had a test focused on it.

Anyone who couldn't taste what they tasted during the baseline would immediately be flagged as a potential carrier and removed until more detailed testing could be done (that testing time is dropping rapidly as well). The combination of a rapid, cheap test and a quick, effective quarantine process could get nations back to work again, avoiding much of the massive financial damage we currently anticipate.
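The baseline-comparison screen described above can be sketched in a few lines: each person records a taste-intensity baseline while healthy, and anyone whose later reading falls far below their own baseline gets flagged for more detailed testing. The threshold, names, and the `flag_for_testing` helper are assumptions made purely for illustration.

```python
# Illustrative sketch of the baseline screen described above:
# flag anyone whose current taste reading dropped well below their
# own healthy baseline. Threshold and names are assumptions.

def flag_for_testing(baselines, readings, drop_threshold=0.5):
    """Return people whose current reading is below `drop_threshold`
    times their own healthy baseline."""
    flagged = []
    for person, baseline in baselines.items():
        current = readings.get(person)
        if current is not None and current < baseline * drop_threshold:
            flagged.append(person)
    return flagged

if __name__ == "__main__":
    baselines = {"alice": 10.0, "bob": 8.0, "carol": 9.0}
    readings = {"alice": 9.5, "bob": 2.0, "carol": 4.0}
    print(flag_for_testing(baselines, readings))  # ['bob', 'carol']
```

Comparing each person against their own baseline, rather than a population average, is what lets a crude test like this scale: it tolerates the natural variation in how strongly different people taste.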

Addressing the Crisis

IBM is one of several companies stepping up sharply to address this crisis. IBM's approach begins with what it does best: the analysis of unstructured data at scale. Its engagement increases the likelihood that we can all get back to something approaching normal life this year, before the economic damage becomes the bigger problem.

This effort will also begin to establish best practices for the next pandemic, and there will be a next one, hopefully ensuring that this pandemic is the last to cause this level of global damage. It is also a showcase of how technology can be rapidly and effectively repositioned to address tactical problems of massive scale, making these applications critical for dealing with the many threats we will face as a species going forward.
