Sunday, July 25, 2021

Big Blue’s Baby Blue Gene Supercomputer

You can call it a “Mini-Me” supercomputer but just don’t call it Blue Gene
Lite. IBM’s researchers don’t like that very much.

IBM unveiled a smaller version of its highly-touted Blue Gene supercomputer
Friday just in time to crack into the coveted Top 500 Supercomputing list
and days before the annual Supercomputing 2003 event in Phoenix, which begins Monday.

While the full Linux-based supercomputer won’t be finished until 2005, the
Armonk, N.Y.-based company has warmed up for the final bow by producing Blue
Gene/L, a much smaller version of the project that is about the size of a
30-inch television. But, said IBM Deep Computing Vice President Dave
Turek, don’t be fooled by the petite nature of the product in a realm ruled
by machines that fill whole rooms.

Blue Gene/L has been clocked at a peak speed of 2 teraflops, or 2 trillion
mathematical operations per second, and sits at number 73 on the Top 500
list, which will be formally issued at the event next week.

Turek said the computer is just a prototype at this point and is not geared
for commercial rollout. But despite the fact that it only measures about one
cubic meter, Turek said the machine stacks up favorably with most machines
with comparable compute power, such as Linux clusters offered by Dell or
SGI. Turek said most 2 teraflop supercomputers fill up entire rooms, often
with more than a dozen racks.

“This is really a marriage of two ideas,” Turek told
internetnews.com. “Customers want amplified computer power but have
legitimate concerns over space and cooling. We think the days of people
building new buildings for computers have gone by the wayside. By
progressively compressing computing in these form factors, we will be able
to deliver supercomputing into the hands of customers such as small and
medium-sized business who could not accommodate it today because they don’t
have the space for massive machines.”

“What IBM has here is a research project for the Department of Energy that
is set to scale to a massive size yet provide a compact footprint for low
power consumption while providing high performance,” DH Brown analyst Rich
Partridge said.

Partridge said the news is proof that IBM has a physical product that works
and has been recognized by the prestigious supercomputing list, but
cautioned that it is not commercially viable at this point. “This is not
just a paper tiger — it’s a working, functional product.”

So, could this mini Blue Gene/L parallel the advancement of thin, yet
powerful form factors like blade servers, which have caught on with big
systems vendors like IBM, Dell, HP and Sun Microsystems in recent years?
That is, could IBM’s work bring on a raft of smaller, modular supercomputers
from the aforementioned competitors?

Turek said no, because supercomputing is too complex for one rival to
closely approximate what another is doing. There is too little commonality
among the architectures, which would be a prerequisite for the
commoditization of mini supercomputers.

“With the notion of commoditization comes the behavior of certain market
dynamics,” Turek said. “There is the fundamental economics of technology to
consider but also the acceptance of how it’s presented. If you decide you
want to make toothpicks made out of thread instead of wood, they may be
cheaper but the demand wouldn’t necessarily drive the supply. It has to be
something that people fundamentally want.”

Moreover, Turek said IBM and rivals face the tough task of scaling up, which
he said is never easy for supercomputers because there are a number of
barriers associated with adding power or nodes to already massive computers.

“There comes a point in scaling up that you run into a wall if you haven’t
thought it through beforehand,” Turek said.

Turek said the original Blue Gene plan, when it was unveiled in 1999, called
for a forest of computers in racks or cubes that could be linked together as
sort of a “modular” supercomputer. The mini Blue Gene/L is a scaled-down
sample of what the ultimate machine will be like, which is being built for
the Lawrence Livermore National Laboratory in California.

To date, Blue Gene/L has been used to map protein folding in the life
sciences space, but Turek envisions it will have the potential to be used
for any number of industries, including digital entertainment and financial
services.

Partridge said it is possible Blue Gene/L could be commercialized for areas
such as seismic engineering or fluid dynamics, and could sell some units in
the high-performance technical computing arena, but wondered whether it
would interfere with IBM’s current Linux supercluster products, powered by
Opteron, Xeon and Itanium processors.

By 2005, Turek said, Blue Gene/L will consist of a group of machines that
will be 128 times larger than today’s prototype and occupy 64 full racks. It
is expected to operate at about 360 teraflops. LLNL researchers hope to use
Blue Gene/L to investigate areas such as cosmology and the behavior of
stellar binary pairs, laser-plasma interactions, and the behavior of nuclear
explosives.

To wit, Blue Gene/L is part of the National Nuclear Security Administration
(NNSA)’s Advanced Simulation and Computing (ASC) Program.
