Thursday, July 29, 2021

Interop: What Are Your Datacenter Metrics?

NEW YORK — Datacenters. Every big enterprise has them, but how many
actually have solid metrics to determine the value of their datacenter?

In a session at Interop, Andreas Antonopoulos, senior vice president and founding partner at Nemertes Research, asked participants how they measure their datacenter metrics. He noted that the metrics people use say a lot about their
role and how they think of the datacenter. It can be measured in terms of
servers, square footage, CPUs and the number of CPU cores. Yet there is
another key metric that must always be put into the equation and measured
against all the others — power.
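
As a rough illustration — with entirely hypothetical numbers, not figures cited in the session — the short Python sketch below shows what it looks like to put power alongside the usual counts of servers, floor space and cores:

    # Illustrative sketch with hypothetical numbers (not figures from the session):
    # putting power alongside the usual datacenter counts.
    servers = 1_200          # physical servers (hypothetical)
    square_feet = 15_000     # raised-floor space (hypothetical)
    cpu_cores = 9_600        # total CPU cores (hypothetical)
    avg_draw_kw = 900        # average facility power draw in kW (hypothetical)
    price_per_kwh = 0.10     # electricity price in USD per kWh (hypothetical)

    annual_power_cost = avg_draw_kw * 24 * 365 * price_per_kwh

    print(f"Servers per kW:        {servers / avg_draw_kw:.2f}")
    print(f"Cores per kW:          {cpu_cores / avg_draw_kw:.2f}")
    print(f"Watts per square foot: {avg_draw_kw * 1000 / square_feet:.1f}")
    print(f"Annual power cost:     ${annual_power_cost:,.0f}")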

“97 percent of people we surveyed had no clue about how much power they
used in terms of cost,” he said. “The problem is power was almost
free until five years ago, but that’s not the case anymore. Now power costs.”

So what’s a datacenter manager to do? Antonopoulos argued that we should
all follow Google, Yahoo and Microsoft (GYM) and build datacenters far away
from dense urban areas, which tend to have higher energy costs, though the
availability of IT staff can sometimes be an issue.

“Why is Google in South Carolina?” Antonopoulos asked. “Chinese T-shirts.
South Carolina used to be a world center for cotton mills, but China devastated
that industry, and so South Carolina has lots of power stations with spare
capacity.”

The other problem in measuring datacenter metrics is the fact that most
current datacenters were built for peak levels of demand. It's a design
that, Antonopoulos argued, assumes demand that cannot really be predicted
and leaves the datacenter inflexible and inefficient.

The solution is to move from a design architecture to a runtime
architecture, using provisioning tools and virtualization so that servers
can be repurposed and reallocated as needed.
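
A minimal sketch of that runtime idea, assuming a made-up capacity model and with print statements standing in for real provisioning calls (this is not a specific vendor tool), might look like the following:

    # Minimal sketch of demand-driven provisioning (hypothetical model, not a
    # specific provisioning product): keep just enough servers active to cover
    # current demand plus headroom, and power the rest down.
    import math

    CAPACITY_PER_SERVER = 500   # requests/sec one server handles (hypothetical)
    HEADROOM = 1.2              # keep 20 percent spare capacity (hypothetical)
    TOTAL_SERVERS = 100         # size of the physical pool (hypothetical)

    def servers_needed(demand: float) -> int:
        """Servers required to cover current demand with headroom."""
        return min(TOTAL_SERVERS,
                   max(1, math.ceil(demand * HEADROOM / CAPACITY_PER_SERVER)))

    def reconcile(active: int, demand: float) -> int:
        """Return the new active count; a real system would call its
        provisioning/virtualization layer here instead of printing."""
        target = servers_needed(demand)
        if target > active:
            print(f"spin up {target - active} servers")
        elif target < active:
            print(f"power down {active - target} servers")
        return target

    # Demand varies through the day instead of sitting at peak.
    active = TOTAL_SERVERS
    for demand in (4_000, 12_000, 35_000, 48_000, 20_000, 6_000):
        active = reconcile(active, demand)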

Antonopoulos noted that he's seen datacenters waste power idling while they wait for peak loads that rarely arrive. As a rough estimate, he said that by spooling up servers and resources as required, instead of provisioning solely for peak capacity, datacenters could cut their power requirements by as much as 30 percent.
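
Taking that 30 percent figure at face value, a back-of-envelope calculation with hypothetical facility numbers gives a sense of the dollars involved:

    # Back-of-envelope estimate using the 30 percent figure from the talk and
    # hypothetical facility numbers.
    peak_provisioned_kw = 1_000   # draw of a peak-provisioned facility (hypothetical)
    savings_fraction = 0.30       # Antonopoulos's rough estimate
    price_per_kwh = 0.10          # USD per kWh (hypothetical)

    saved_kw = peak_provisioned_kw * savings_fraction
    annual_savings = saved_kw * 24 * 365 * price_per_kwh
    print(f"~{saved_kw:.0f} kW avoided, roughly ${annual_savings:,.0f} per year")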

Instead of tiered datacenter structures where Web, application and database servers all sit in separate silos, Antonopoulos strongly advocates a
more flexible approach there as well. He argued that the network
architecture should be flat and simple: fewer layers, lower latency,
fewer hops and higher capacity.

The key is virtualization, which enables better utilization and the
ability to move servers around as needed. Virtualization is a particularly
hot topic this week as VMware, Cisco (NASDAQ: CSCO) and others roll out new initiatives.

This article was first published on InternetNews.com.
