Our ever-evolving digital lifestyle has changed the way we do many things, from online banking and researching school papers to finding a soul mate, booking airline tickets, and buying homes.
One consequence is that energy consumption by datacenters in the U.S. has doubled over the past five years, and that growth rate shows no signs of slowing down.
That’s part of the price of progress, both in terms of energy consumption and contributions to climate change, according to a new report from the Environmental Protection Agency (EPA) to Congress.
In 2006 alone, U.S. datacenters consumed about 61 billion kilowatt-hours (kWh) of electricity, or roughly 1.5 percent of all U.S. electricity consumption. That much energy cost about $4.5 billion, according to the report, prosaically titled “Report to Congress on Server and Datacenter Energy Efficiency.” The report was assembled under the auspices of the EPA’s Energy Star program.
Of course, Google, Yahoo, Microsoft, and other large search vendors and free e-mail purveyors, along with their users, are only partly to blame. Datacenters are required to run businesses as well as government.
But the problem is growing rapidly.
The estimated level of consumption in 2006 was “more than the electricity consumed by the nation’s color televisions and similar to the amount of electricity consumed by approximately 5.8 million average U.S. households,” the report states.
In fact, by 2011 electricity consumption by datacenters is projected to nearly double again, to 100 billion kWh, costing $7.4 billion and accounting for about 2.5 percent of all U.S. electricity consumption.
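Those figures allow a few back-of-the-envelope checks: they imply an average electricity price of roughly seven cents per kWh, an annual growth rate in consumption of about 10 percent, and a national total of about 4,000 billion kWh. The short Python sketch below reproduces that arithmetic; the five-year window is an assumption, and the other inputs are simply the report figures quoted above.

```
# Back-of-the-envelope check of the figures quoted above (illustrative only).
kwh_2006 = 61e9        # 2006 datacenter consumption, kWh (from the report)
cost_2006 = 4.5e9      # 2006 electricity cost, dollars (from the report)
kwh_2011 = 100e9       # projected 2011 consumption, kWh (from the report)
cost_2011 = 7.4e9      # projected 2011 cost, dollars (from the report)
years = 5              # assumed 2006-2011 window

price_2006 = cost_2006 / kwh_2006                         # ~$0.074 per kWh
price_2011 = cost_2011 / kwh_2011                         # ~$0.074 per kWh
annual_growth = (kwh_2011 / kwh_2006) ** (1 / years) - 1  # ~10.4 percent per year
implied_us_total = kwh_2006 / 0.015                       # ~4,000 billion kWh nationwide

print(f"Implied price: ${price_2006:.3f}/kWh in 2006, ${price_2011:.3f}/kWh in 2011")
print(f"Implied annual growth in consumption: {annual_growth:.1%}")
print(f"Implied total U.S. consumption in 2006: {implied_us_total / 1e9:.0f} billion kWh")
```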
“As the U.S. economy increasingly shifts from paper-based to digital information management, datacenters have become a vital part of business, communication, academic, and governmental systems. Over the last five years the increase in use of these systems, and the power and cooling infrastructure that supports them, have doubled energy use, increased greenhouse gas emissions and raised concerns about power grid reliability,” the report continues.
And it’s not just about powering racks of computers; it’s also about cooling them to remove excess heat.
“For every watt consumed on the [computing] side, you have to have a watt on the cooling side,” Andrew Fanara, team lead for product development in the Energy Star program, said on a conference call on Thursday.
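Taken literally, that watt-for-watt rule means total facility draw is roughly twice the computing load. A minimal sketch of the arithmetic, using an assumed 5 kW rack purely for illustration:

```
# Watt-for-watt rule of thumb: every watt of IT load needs a watt of cooling,
# so total facility draw is roughly double the computing load.
def total_facility_power(it_watts, cooling_watts_per_it_watt=1.0):
    """Estimate total draw; the 1:1 cooling ratio is Fanara's rule of thumb."""
    return it_watts * (1 + cooling_watts_per_it_watt)

rack_it_load = 5_000  # assumed 5 kW rack, purely for illustration
print(total_facility_power(rack_it_load))                 # 10000.0 watts total
print(total_facility_power(rack_it_load) / rack_it_load)  # 2.0, a PUE-like ratio
```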
Overall, while the report is not a “how-to” document, Fanara said, the best near-term medicine for the problem is for IT staffs and server manufacturers to work together to lower electricity demand.
For instance, datacenters should consider sub-metering consumption on each IT equipment rack to better track usage and pursuing more server consolidation. They should also pull the plug on unused or less efficient equipment. Manufacturers can help by using the most efficient and current technologies.
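To put the consolidation advice in rough numbers, the hypothetical sketch below estimates what retiring lightly used servers might save. The server wattage is an assumption for illustration, the electricity price is the one implied by the report’s cost figures, and the cooling overhead follows the watt-for-watt rule quoted above; none of these are figures from the report itself.

```
# Hypothetical estimate of savings from decommissioning lightly used servers.
HOURS_PER_YEAR = 8760

def consolidation_savings(servers_retired,
                          watts_per_server=300,   # assumed average draw per server
                          price_per_kwh=0.074,    # implied by the report's cost figures
                          cooling_overhead=1.0):  # watt-for-watt cooling rule of thumb
    """Return (kWh saved per year, dollars saved per year)."""
    it_kwh = servers_retired * watts_per_server * HOURS_PER_YEAR / 1000
    total_kwh = it_kwh * (1 + cooling_overhead)
    return total_kwh, total_kwh * price_per_kwh

kwh, dollars = consolidation_savings(servers_retired=100)
print(f"{kwh:,.0f} kWh and ${dollars:,.0f} saved per year")
```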
Organizations should also implement best practices for datacenter construction and management. And the industry should agree on common benchmarks for hardware.
“[Increasing] efficiency is the best resource we have available,” Fanara said.
According to the report, those practices and guidelines, if closely followed, could yield savings sufficient to bring 2011 consumption back down to pre-2006 levels without limiting the computing power of the servers, which would run counter to the reasons for deploying a large datacenter in the first place.
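In round numbers, holding 2011 consumption to the 2006 level would mean saving the difference between the two figures quoted earlier, roughly 39 billion kWh a year, or close to $3 billion at the implied price. A one-line check:

```
# Rough size of the savings implied by holding 2011 use to pre-2006 levels.
kwh_saved = 100e9 - 61e9            # projected 2011 use minus 2006 use, kWh
dollars_saved = kwh_saved * 0.074   # at the price implied by the report's figures
print(f"{kwh_saved / 1e9:.0f} billion kWh, roughly ${dollars_saved / 1e9:.1f} billion per year")
```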
That will not be easy, the report’s authors admit.
However, among other things, the report recommends the creation of a standardized whole-building performance rating system for datacenters, as well as the development of Energy Star specifications for servers. It also advocates encouraging electric utilities to offer financial incentives for datacenter efficiency improvements and making greater use of public/private partnerships to jointly solve problems.
The report and more information are available here.
This article was first published on InternetNews.com.