Tuesday, October 8, 2024

Improving Data Center Efficiency: Five Steps


In the first week of September last year, I explained Why You Need a Fuller Data Center and challenged you to take part in some data center minimizing activities. One of those activities was workload (server) consolidation. If workload consolidation wasn't on your 2009 to-do list, I'd put money on it being there this year. Workload consolidation is one way to decrease the number of servers sucking power while delivering diminishing returns: systems whose maintenance costs outweigh their productivity.

Workload consolidation involves analyzing system performance and combining the workloads of underutilized systems to create a more efficient data center. For example, if you have 10 web servers all humming along at 15 to 20 percent CPU and memory utilization, those systems are, from a practical standpoint, idle. Their low-impact workloads present you with the opportunity to combine them onto a pair of highly available systems whose average utilization will hover in the 65 to 75 percent range. Peak utilization might reach 95 percent at times, but that average range is a comfortable target to aim for.
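To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The 1.3x capacity ratio is my own assumption (the column doesn't size the target hardware); it simply illustrates how the combined load from 10 lightly used servers can land near that 65 to 75 percent band on a pair of somewhat more capable machines.

```python
# Back-of-the-envelope consolidation sizing (illustrative only).
# capacity_ratio is a hypothetical figure: how much more capable each
# target server is than one of the original servers.

def projected_utilization(per_server_util, n_sources, n_targets, capacity_ratio):
    """Estimate average utilization (%) per consolidated target server."""
    total_load = per_server_util * n_sources       # load in "source server" units
    total_capacity = n_targets * capacity_ratio    # capacity in the same units
    return total_load / total_capacity

# The example from above: 10 web servers idling at 15 to 20 percent.
for util in (15, 20):
    projected = projected_utilization(util, n_sources=10, n_targets=2, capacity_ratio=1.3)
    print(f"{util}% per source -> about {projected:.0f}% per target")
# 15% per source -> about 58% per target
# 20% per source -> about 77% per target
```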

This week, I present five steps to a more efficient data center through workload consolidation.

Step 1: Collect Data

The first step in this process is to collect system performance data. This step is likely to take the longest. You don't want performance snapshots but rather a full picture of performance trends. You must gather enough data to see hourly trends, day-of-week trends and even monthly trends. A year's worth of data is a reasonable amount to gather. If you already have this data, then you're ahead of the game and may proceed to the next step: Data Analysis.

If you haven't gathered system performance data, you must engage your staff to do so. There are numerous tools available for gathering this data; the free, open-source tool Orca is a good example of the kind of data collector you need for this activity. If time permits, collect data for at least two weeks before attempting any analysis in the next step.
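If Orca or a comparable agent isn't an option in your shop, even a simple script can start building the trend history you need. The sketch below is my own illustration, not part of Orca; it assumes the cross-platform psutil library and a hypothetical output file, and it logs CPU and memory utilization to a CSV at a fixed interval, which is enough raw material for the hourly and day-of-week views described above.

```python
# Minimal performance logger (a stand-in for a collector like Orca).
# Assumes the psutil library is installed: pip install psutil
import csv
import os
import time
from datetime import datetime

import psutil

SAMPLE_INTERVAL_SECONDS = 300      # one sample every five minutes
OUTPUT_FILE = "perf_samples.csv"   # hypothetical output path

def collect_forever():
    write_header = not os.path.exists(OUTPUT_FILE)
    with open(OUTPUT_FILE, "a", newline="") as f:
        writer = csv.writer(f)
        if write_header:
            writer.writerow(["timestamp", "cpu_percent", "mem_percent"])
        while True:
            cpu = psutil.cpu_percent(interval=1)      # 1-second CPU sample
            mem = psutil.virtual_memory().percent     # memory in use, percent
            writer.writerow([datetime.now().isoformat(), cpu, mem])
            f.flush()
            time.sleep(SAMPLE_INTERVAL_SECONDS)

if __name__ == "__main__":
    collect_forever()
```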

Step 2: Data Analysis

After you've collected enough data, it's time to analyze it. This analysis is the foundation for the workload consolidation plan you'll create in Step Three. Fortunately, tools like Orca have a strong visual component as well as a strong numeric one. The hourly, daily, weekly, monthly, quarterly and yearly graphs offer great insight into each system's performance at a glance. You won't need a calculator to spot positive or negative trends.
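For readers who collected their own CSV samples instead of relying on Orca's built-in graphs, a few lines of pandas produce a similar at-a-glance view. This is a sketch under my own assumptions about the CSV layout from the collection example above (timestamp, cpu_percent and mem_percent columns); it is not an Orca feature.

```python
# Summarize collected samples into hourly and day-of-week trends.
# Assumes pandas, matplotlib, and the perf_samples.csv layout from the
# collection sketch above.
import matplotlib.pyplot as plt
import pandas as pd

df = pd.read_csv("perf_samples.csv", parse_dates=["timestamp"])
df = df.set_index("timestamp")

# Average utilization by hour of day: shows daily peaks and quiet periods.
hourly = df.groupby(df.index.hour)[["cpu_percent", "mem_percent"]].mean()

# Average utilization by day of week (0 = Monday): shows weekly cycles.
weekly = df.groupby(df.index.dayofweek)[["cpu_percent", "mem_percent"]].mean()

print("Average utilization by hour of day:")
print(hourly.round(1))
print("Average utilization by day of week (0 = Monday):")
print(weekly.round(1))

# A daily-average chart makes flat, rising, or falling trends obvious.
ax = df.resample("1D").mean().plot(title="Daily average CPU and memory utilization")
ax.figure.savefig("daily_trend.png")
```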

Read the rest at ServerWatch.
