
Gartner: Too Many Chips Spoil the Server Broth

With the number of chips per server and cores per chip climbing, future generations of servers may end up with far more processing power than their software could possibly utilize, even under virtualization, according to a Gartner report issued earlier this week.

This doubling and doubling again of cores will drive servers well above the peak levels for which software systems are engineered, including operating systems, middleware, virtualization tools and applications. The result could be a return to servers running at single-digit utilization levels.
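To put rough numbers on that claim (the figures below are illustrative, not from the Gartner report), consider a software stack engineered to scale to 32 concurrent threads: its peak utilization falls off quickly as core counts double.

```python
# Illustrative arithmetic only: peak utilization when the software
# stack tops out at a fixed thread count, whatever the core count.
SOFTWARE_THREAD_CEILING = 32  # hypothetical scaling limit

for total_cores in (32, 64, 128, 256, 512):
    utilization = SOFTWARE_THREAD_CEILING / total_cores
    print(f"{total_cores:4d} cores -> peak utilization {utilization:.0%}")
```

On a 512-core machine, that 32-thread ceiling caps utilization at about 6 percent, squarely in the single digits.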

The problem is that the computer industry relies on constant upgrades. It’s not like consumer electronics, where, for example, stereo technology remained essentially unchanged for decades. The computer industry is driven by Moore’s Law, which means Intel has to keep selling chips and OEMs have to keep selling servers.

“Their whole business model is driven on delivering more for the same price,” said Carl Claunch, vice president and distinguished analyst at Gartner. “They have to keep delivering on the refresh rate, and you have to be constantly delivering something new.”

And fast chips are more glamorous than work on the memory and I/O subsystems, which have lagged processor performance. Memory and I/O buses are much slower than the CPU, causing bottlenecks even on a single PC. On a virtualized system, the problem can be even worse.

So with Intel (NASDAQ: INTC) flooring the gas pedal on new products, vendors like IBM (NYSE: IBM), Dell (NYSE: DELL) and HP (NYSE: HPQ) have little choice but to follow in order to capture product-refresh revenue. “When someone does take their foot off the gas it will be a train wreck, because so much is dependent on that rate of refresh and speed of improvement,” said Claunch.

Ed Turkel, manager of the Scalable Computing & Infrastructure unit at HP, seemed to concur. “Due to the more compute power available with multi-core systems, the applications may need to be re-implemented to fully take advantage of the compute power available to them,” he said in an e-mail to InternetNews.com.

“This issue is commonplace in high performance computing today, but we will start to see this as an issue in other segments. For instance, virtualization environments will also need to become more multi-core-aware, perhaps creating virtual machines that virtualize multiple cores into a single machine that hides this added complexity.”
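As a minimal sketch of the kind of re-implementation Turkel describes (the record-processing workload here is entirely hypothetical), a serial loop can be spread across all available cores with Python's standard multiprocessing module:

```python
from multiprocessing import Pool

def process_record(record):
    # Hypothetical per-record work; stands in for any CPU-bound task.
    return sum(i * i for i in range(record))

if __name__ == "__main__":
    records = list(range(10_000))

    # Serial version: occupies a single core no matter how many exist.
    serial = [process_record(r) for r in records]

    # Parallel version: Pool() defaults to one worker per logical CPU,
    # so the same workload can spread across every core the box has.
    with Pool() as pool:
        parallel = pool.map(process_record, records)

    assert serial == parallel
```

Because Pool() sizes itself to the machine's CPU count by default, the parallel version at least has a chance of tracking rising core counts, while the serial version stays pinned to one core forever.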

Sockets, chips and cores, oh my!

Currently, the most popular server motherboards have two to four sockets, with dual-socket the most common, according to Intel. Anything above four sockets is labeled a “multiprocessor” (MP) server, but those are rare, confined to extremely high-end systems and accounting for single-digit market share.

It gets even more confusing on the processor side: the return of Hyper-Threading (simultaneous multithreading) in Intel’s Core i7 (“Nehalem”) means one physical core appears to the operating system as two logical processors, each capable of running its own thread.
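A short sketch shows that doubling from the operating system's point of view (assuming Python's standard library and the two-way SMT that Nehalem-class parts provide):

```python
import os

# os.cpu_count() reports *logical* processors: on a chip with two-way
# SMT (Hyper-Threading), that is twice the physical core count.
logical = os.cpu_count() or 1

THREADS_PER_CORE = 2  # assumption for an SMT-enabled part, not queried
print(f"Logical processors visible to the OS: {logical}")
print(f"Physical cores, assuming 2-way SMT: {logical // THREADS_PER_CORE}")
```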

So far, Intel has launched a six-core Xeon, and AMD has a six-core Opteron in the works. Intel plans an eight-core server version of the Core i7 (“Nehalem”) that will run two threads per core, and AMD is planning a 12-core server processor for 2011.

If motherboard makers move to 8-, 16- or 32-socket designs, 256-core machines become possible with eight-core parts. With 12- and 16-core processors, that figure could reach 512 cores, and counts will keep climbing in the coming years.
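The arithmetic behind those projections is straightforward: sockets times cores per socket, times threads per core if SMT is counted. A quick sketch:

```python
def hardware_threads(sockets, cores_per_socket, threads_per_core=1):
    # Total schedulable processors on one server.
    return sockets * cores_per_socket * threads_per_core

print(hardware_threads(32, 8))       # 256 cores
print(hardware_threads(32, 16))      # 512 cores
print(hardware_threads(32, 16, 2))   # 1024 threads with 2-way SMT
```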

This article was first published on InternetNews.com.
