Tuesday, September 25, 2012

Data Center Electricity Use in Proportion

Imagine if there were robots to do all our work for us — robots to harvest our crops, drive our cars, even write our blogs. We wouldn’t have to work anymore. Now imagine that the robots decided to take over the world: eat our food, sleep in our beds, write devastating insults on our Facebook pages. That wouldn’t be so wonderful, would it?

It’s an issue that bears thinking about because robots have needs and we want to make sure we don’t get crowded out ourselves. Things end badly in the very first robot story, the 1921 Karel Čapek play R.U.R.: the robot army scours the world and wipes out the human race. If robots are that costly, we might decide that we don’t want them. In more general terms, it is important to be aware of the costs of any technology we rely on. Often enough, as in R.U.R., we avoid thinking about it, afraid that we will find out that the costs are untenable.

That basic fear is what gives rise to the occasional scare story about the high costs of operating computers, like the newspaper story on cloud data center electricity usage that everyone is talking about this week.

In analyzing resource usage, it is important to maintain proportion by keeping everything inside some kind of frame. Otherwise, with a few simplifying assumptions, you could easily calculate that the resource usage you are looking at is larger than the entire world. If you are attempting a back-of-the-envelope estimate of computer electricity usage, you could easily make any of a number of mistakes, confusing:

  • Capacity, peak usage, and average usage. A computer system might be rated to use 230 watts, but actually use 165 watts at peak and 45 watts on average. The sketch after this list shows how far apart the resulting annual estimates land.
  • System, device, and component. A central processor may use a small fraction of the power consumption of a computer, which in turn is a fraction of the total for a computer system.
  • Front end and back end. Every web site is located in a data center, but that doesn’t mean every web site has its own data center, or even its own server.
  • Technology generations. Every few years, old hardware is replaced with new devices that have very different physical and operating characteristics. A facility planned for 2013 will operate nothing like one built in 2006.
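
To see how much the first of these confusions can matter, here is a minimal back-of-the-envelope sketch in Python, using the hypothetical 230, 165, and 45 watt figures from the list above. Converting each to an annual energy total spreads the estimates across roughly a factor of five.

    # Back-of-the-envelope sketch: annual energy from capacity, peak, and average.
    # The 230/165/45-watt figures are the hypothetical ones from the list above.
    HOURS_PER_YEAR = 8760

    figures = {
        "rated capacity": 230,   # nameplate watts
        "peak usage": 165,       # highest observed draw, in watts
        "average usage": 45,     # what the electric bill actually reflects, in watts
    }

    for label, watts in figures.items():
        kwh_per_year = watts * HOURS_PER_YEAR / 1000
        print(f"{label}: {kwh_per_year:,.0f} kWh/year")

    # Prints roughly:
    #   rated capacity: 2,015 kWh/year
    #   peak usage: 1,445 kWh/year
    #   average usage: 394 kWh/year

An estimate built from nameplate capacity overstates what the electric bill would actually show by about a factor of five.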

The news stories this week are misleading to begin with, but the discussion surrounding them is on another planet. People are coming away with the impression that data centers consume more than 2 percent of total global energy. To bring the wretched excesses back into the realm of reason, it helps to put the numbers inside some kind of frame.

  • Industrial scale. What percent of the large industrial buildings in the world are data centers? A small fraction of 1 percent?
  • Electricity supply units. You’re not talking about electricity supply planning in a country like the United States until you’re talking about gigawatts at least. When you see a chart showing gigawatt-hours per year, you know someone is trying to slip a much smaller unit past you; the conversion sketch after this list shows just how much smaller.
  • Capacity trends. If usage is actually trending upward, then eventually facilities will have to add capacity. For data center electricity, this would mean crews running new electricity supply lines to the building. But this is the rare exception. Instead, most data centers use less electricity than they did when they were first installed, the result of more efficient hardware. With computing capacity and efficiency both increasing every year, resource totals like electricity and floor space are probably declining, though it is hard to forecast.
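
On the question of units, here is a minimal conversion sketch. The 1,000 gigawatt-hour figure is purely illustrative, not a number from any of this week's stories; the point is simply that a year contains 8,760 hours, so a gigawatt-hour per year is a very small rate of draw.

    # Conversion sketch: gigawatt-hours per year expressed as an average rate of draw.
    # The 1,000 GWh/year headline figure is illustrative only.
    HOURS_PER_YEAR = 8760

    headline_gwh_per_year = 1000.0
    average_gw = headline_gwh_per_year / HOURS_PER_YEAR

    print(f"{headline_gwh_per_year:,.0f} GWh/year is an average draw of "
          f"{average_gw:.3f} GW, or about {average_gw * 1000:.0f} MW")

    # Prints roughly:
    #   1,000 GWh/year is an average draw of 0.114 GW, or about 114 MW

A gigawatt-hour per year, in other words, is about 8,760 times smaller than a gigawatt, which is exactly the kind of slip those charts rely on.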

Data center electricity usage might not be the big problem that people are imagining this week, but it is still nice to be efficient about it. Aerial photos show places where power-intensive data centers are located within a stone’s throw of power sources. That helps, though the energy consumed by workers commuting to the facility must also be considered. On a personal level, we can minimize our use of data center capacity by avoiding peak periods. This could mean doing remote backups and installations in the evening (west coast) or early morning (east coast). Taking the time to throw away unwanted media files, especially photos and movies, can make a difference. Waiting until day 2 to download a hot software release makes the biggest difference.

But we must not confuse this level of energy saving with the savings that come from home energy audits, more efficient refrigerators, LED room lighting, and regular maintenance for fuel-burning cars — and these, in turn, are dwarfed by the potential of electric cars and rooftop solar installations. In the United States, if a huge data center uses 50 megawatts, your share of that power consumption is about one fifth of a watt, not really enough to worry about. Choosing a high-efficiency light bulb might save you 10 or 15 watts. I would rather have you pay attention to those energy-saving opportunities first.
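
For anyone who wants to check that last bit of arithmetic, here is a minimal sketch. The 310 million population figure is an assumption, roughly the United States in 2012; the 50-megawatt data center and the 10-to-15-watt bulb savings are the figures from the paragraph above.

    # Per-person share of one large data center, versus one efficient light bulb.
    # 310 million is an assumed, approximate 2012 U.S. population;
    # 50 MW and the 10-15 watt bulb savings come from the paragraph above.
    data_center_watts = 50e6
    us_population = 310e6
    bulb_savings_watts = 12  # midpoint of the 10-15 watt range

    share_watts = data_center_watts / us_population
    print(f"Your share of the data center: {share_watts:.2f} W")
    print(f"One efficient bulb saves about {bulb_savings_watts / share_watts:.0f} times that")

    # Prints roughly:
    #   Your share of the data center: 0.16 W
    #   One efficient bulb saves about 74 times that

The exact ratio shifts with the population you assume, but the two orders of magnitude between the bulb and the data center share do not.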