They include the IBM blade server; the TB2 from T-Platforms (the largest HPC provider in Russia); and the Cray XE6 supercomputer.
These join similar products from Dell, HP, and other vendors that are already on the market. This isn't because folks in data centers are suddenly playing World of Warcraft on servers; they've been playing on their laptops for years. Instead, it reflects a change in how some types of work are being done in the enterprise.
Since I'm at this conference this week, let's talk about it.
The applications that benefit from this newer approach to computing are those that are massively multi-threaded. That lends itself to complex numerical computation: things like molecular dynamics, simulation, and, as you would expect, imaging.
Examples here at the show appear to favor tasks that would historically have required massively expensive supercomputers. One of the examples they provided, with regard to supercomputing loads, was that eight Fermi processors (NVIDIA's current graphics technology) can currently do more work in this class than 192 quad-core CPUs over the same time period.
The visual image the firm showcased, which underscores this, is that a small server configuration taking up about half of a single rack can outperform a large CPU-based supercomputer for some tasks of this type. (The picture was of the Kraken supercomputer, which means the supercomputer folks are embracing this technology as well.)
But this is just the potential of this technology; the interesting part of this show is what is being done with it.
Pharmaceuticals: Drug design (and chemical analysis in general) has proven to be a very powerful use of this new technology. Massive amounts of data have to be crunched to determine not only whether the drugs do what they are supposed to do, but also that they don't do things we don't want them to do.
The systems are used to analyze molecular dynamics to determine what the interaction actually is. Making this kind of computational power more affordable is bringing new, better, and safer drugs to market much more quickly.
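To illustrate why molecular dynamics parallelizes so well, here is a toy sketch of my own (not anything the vendors showed): the Lennard-Jones pair potential, the kind of interaction a real simulation evaluates for millions of particle pairs every timestep. Because each pair is independent, the work maps naturally onto thousands of GPU threads; the pure-Python version below just shows the math.

```python
import math

def lj_potential(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones pair potential: repulsive at short range,
    weakly attractive at long range. Evaluated per particle pair."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def total_energy(positions):
    """Sum the potential over all pairs of 1-D particle positions.
    Every (i, j) term is independent of the others, which is the
    property that makes this workload GPU-friendly."""
    e = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            e += lj_potential(abs(positions[i] - positions[j]))
    return e
```

The potential has its minimum of -epsilon at r = 2^(1/6) * sigma, which makes a handy sanity check for the implementation.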
For me this goes to the core of why this is interesting as it could, in fact, save my life.
Geology: On that same lifesaving angle, I live in California, where earthquakes are a way of life, and unlike hurricanes the only notice you get is the roof rapidly collapsing over your head. Here GPU technology is being used for seismic imaging, particularly reverse time migration, so that earthquakes can be better modeled and the resulting damage better anticipated.
By understanding this better, not only do predictions become more feasible, but building requirements can better match the actual risks.
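At its core, reverse time migration repeatedly applies a wave-equation stencil to a grid, forward and then backward in time. As an illustrative sketch (a 1-D toy of mine, nothing like production seismic code), here is one explicit finite-difference step of the acoustic wave equation; every grid point updates independently of the others, which is exactly the shape of problem a GPU eats for breakfast.

```python
def wave_step(u_prev, u_curr, c, dt, dx):
    """One explicit finite-difference step of the 1-D acoustic wave
    equation u_tt = c^2 * u_xx. Takes the field at the previous two
    timesteps and returns the next one; boundary points stay at zero."""
    n = len(u_curr)
    u_next = [0.0] * n
    r2 = (c * dt / dx) ** 2  # squared Courant number; must be <= 1 for stability
    for i in range(1, n - 1):
        u_next[i] = (2.0 * u_curr[i] - u_prev[i]
                     + r2 * (u_curr[i + 1] - 2.0 * u_curr[i] + u_curr[i - 1]))
    return u_next
```

A quick sanity check: a spatially constant field should stay constant in the interior, since the stencil's second-difference term vanishes.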
Automotive and Aircraft Design: Here GPU computing is used for computational fluid dynamics to create more fuel-efficient cars and planes, and likely safer ones as well. Given a world that is addicted to oil, and in which our lives depend on these vehicles, affordable access to processing power like this helps on both vectors: safety and efficiency.
Medical Imaging: There were a number of demonstrations on this, with the most interesting being a forward-looking talk about an effort to allow doctors to effectively shrink themselves down and explore a patient from the inside (making the movie Fantastic Voyage suddenly more real).
This imaging capability, coupled with ever smaller remote-controlled surgical robotics, could not only identify things like tumors more effectively but also remove ones that are currently inoperable.
Trading: Here is one place where game theory isn't talking about how to beat an elf in an online multi-player game. Instead, it's focused on analyses like Monte Carlo simulation, which allow traders to more accurately and quickly determine security pricing trends and respond to them.
Here it isn't the safety of your body but that of your retirement portfolio that is being protected. And for those of us who aren't billionaires (and for those billionaires who want to remain rich), this is incredibly important.
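To make the Monte Carlo idea concrete, here is a minimal sketch of my own (not anything shown at the conference): pricing a European call option by simulating terminal stock prices under standard Black-Scholes dynamics. The function name and parameters are illustrative; the point is that every simulated path is independent, so a real trading system would run millions of them in parallel on the GPU instead of in a Python loop.

```python
import math
import random

def mc_european_call(s0, k, r, sigma, t, n_paths=100_000, seed=42):
    """Estimate a European call price by Monte Carlo, assuming the
    stock follows geometric Brownian motion (Black-Scholes model).
    s0: spot price, k: strike, r: risk-free rate,
    sigma: volatility, t: time to expiry in years."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)          # one independent path per draw
        st = s0 * math.exp(drift + vol * z)  # simulated terminal price
        payoff_sum += max(st - k, 0.0)       # call payoff on this path
    return math.exp(-r * t) * payoff_sum / n_paths  # discounted average

price = mc_european_call(s0=100, k=100, r=0.05, sigma=0.2, t=1.0)
```

For these at-the-money parameters the closed-form Black-Scholes price is about 10.45, so the estimate should land close to that, with the error shrinking as the path count grows.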
Weather Forecasting: As with most of these examples, this too has to do with modeling. By analyzing a massive number of sensors in real time and comparing that data to prior events, storms that have not yet become visible can be better anticipated.
This could dramatically reduce the potential life risk from a storm like Katrina and, coupled with other modeling, signal the need for an evacuation far more quickly.
But the coolest thing here is called the Photonic Fence, which is kind of a laser defense system funded by Bill Gates' foundation that, and I kid you not, shoots mosquitoes out of the air.
And it can even tell the difference, and I'm still not joking, between male and female mosquitoes. It needs some kind of a voice, though, something like "Incoming threat detected, gender analyzed, deploying laser, launching, threat eliminated" to make it work the way I want in my own yard. While this tool is being used to reduce the threat of diseases like malaria, you would think similar technology could also be used to, I don't know, actually get a missile defense system to work.
In any case, graphics technology is moving into the enterprise. Where it works, it is saving lives and making money (note my priority), and it is hard to argue that either is a bad thing.
Oh, and if you think this change was coming fast, NVIDIA announced they are accelerating it this week.