How are you controlling your data center costs?

November 30, 2016


By Bill Henneberry, Vice President, Technology and Operations

Power usage effectiveness (PUE) is simply the ratio of total data center energy consumption to the amount of energy consumed by the IT equipment alone. Since the data center’s primary purpose is to supply IT resources to a company or consumers, an ideal PUE is 1.0: all energy consumed by the data center goes to the IT equipment. That would mean no energy is used or lost in the power distribution system (including cables, uninterruptible power supplies, voltage conversion stages and so on), security system, lighting, heating/cooling or any other non-IT portion of the data center. Obviously, then, a PUE of 1.0 is unattainable, since it requires 100% efficiency.
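The ratio itself is trivial to compute. Here's a minimal sketch in Python (the function name and the sanity checks are mine, not part of any standard tooling); it simply divides total facility energy by IT equipment energy and rejects values that are physically impossible:

```python
def compute_pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return PUE: total data center energy divided by IT equipment energy.

    Both readings should cover the same time window and use the same units.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    pue = total_facility_kwh / it_equipment_kwh
    if pue < 1.0:
        # The facility total includes the IT load, so the ratio
        # can never dip below 1.0 in a correct measurement.
        raise ValueError("PUE below 1.0 indicates a measurement error")
    return pue


# Example: a facility that draws 1,500 kWh in total while its
# IT equipment consumes 1,000 kWh has a PUE of 1.5.
print(compute_pue(1500, 1000))  # → 1.5
```

A PUE of 1.5 means that for every watt reaching the IT equipment, another half watt is spent on cooling, power distribution losses, lighting and everything else in the facility.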

Of course, simply looking at the definition of PUE leaves some grey areas. What if a company has on-site power generation infrastructure at its data center, for instance? Does that power count toward the total power consumed by the facility? Should peripheral power drains like lighting and security really be included in the PUE calculation? The Green Grid, the organization that developed the PUE metric, provides some guidance on how it is to be properly calculated, but unscrupulous companies can easily “cook the books” to make their facilities look more efficient than they really are.

Thus, PUE measurements should always be taken with a grain of salt, particularly when no details are provided about how they were calculated. If you ever see a PUE of exactly 1.0 or less, you can be certain you're being lied to. But what about a PUE of 1.001? Believe it only if you think the facility loses a mere 0.1% of every watt delivered to its servers to power distribution inefficiencies, to say nothing of lighting or anything else happening in the facility. In other words, don't believe it. Still, for a company honestly seeking to improve its energy efficiency (not just its public image), PUE can be a helpful metric. Here are some ways to manage it.

The bottom line: it comes down to the design criteria and how you ultimately control and operate your facility. Take baby steps, like implementing cold aisle containment, installing blanking panels, retrofitting your lighting to LED, and placing temperature sensors close to your servers so your mechanical systems react in real time. Keep track of your power draws and, where possible, peak-shave your loads at key times.
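Tracking power draws and identifying peak-shaving opportunities can start very simply. The sketch below (the function, threshold choice, and sample readings are all illustrative assumptions, not a vendor tool) scans a series of hourly power readings and flags the hours running near the daily peak, which are the natural candidates for load shifting or peak shaving:

```python
def peak_shave_candidates(hourly_kw: list[float], fraction: float = 0.9) -> list[int]:
    """Return the hours whose draw is within `fraction` of the daily peak.

    hourly_kw: one power reading (kW) per hour.
    fraction: how close to the peak a reading must be to be flagged
              (0.9 means within 90% of the maximum draw).
    """
    if not hourly_kw:
        return []
    threshold = fraction * max(hourly_kw)
    return [hour for hour, kw in enumerate(hourly_kw) if kw >= threshold]


# Hypothetical readings for a five-hour window (kW).
readings = [100, 200, 400, 390, 150]
print(peak_shave_candidates(readings))  # → [2, 3]
```

Here hours 2 and 3 sit within 90% of the 400 kW peak, so those are the windows where shifting non-critical load (or drawing from on-site storage) would flatten the facility's demand curve.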