Typically, energy efficiency is seen as the reduction in metered energy use resulting from energy programs while maintaining the same level of comfort, services, and production. In today’s more complex energy world, this simplistic definition can lead to decisions that make little sense in the long term. It seems timely to broaden how efficiency is viewed.
Defining efficiency at the site meters ignores the energy used to generate and transport electricity. This is typically twice as much energy as the delivered electricity itself, and it represents fuel the end user has already paid for. If efficiency is redefined relative to source energy, or primary fuel, we capture the broader impact of our efficiency programs. Viewed this way, it becomes clear that cutting electricity use has a much greater impact on source energy than efficiency programs aimed at reducing natural gas use.
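The arithmetic behind this point can be sketched in a few lines. The conversion factors below are assumptions for illustration only: roughly three units of primary fuel per delivered unit of grid electricity (consistent with two units lost in generation and transmission for every unit delivered), and a modest ~10% upstream overhead for natural gas.

```python
# Illustrative site-to-source conversion. These factors are assumptions
# for the sketch, not published values: ~3 units of primary fuel per
# delivered unit of grid electricity, ~1.1 for natural gas.
SOURCE_FACTORS = {"electricity": 3.0, "natural_gas": 1.1}

def site_to_source(site_mwh: float, fuel: str) -> float:
    """Convert metered (site) energy to source/primary energy."""
    return site_mwh * SOURCE_FACTORS[fuel]

# Saving 100 MWh at the meter has very different source impacts:
elec_saving = site_to_source(100, "electricity")  # 300 MWh of primary fuel
gas_saving = site_to_source(100, "natural_gas")   # 110 MWh of primary fuel
print(elec_saving, gas_saving)
```

Under these assumed factors, an identical metered saving is worth nearly three times as much primary fuel when it comes from electricity rather than gas, which is the asymmetry the paragraph describes.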
Measuring efficiency in terms of both site and source energy gives a more realistic picture of the overall impact of efficiency measures. A source energy view is arguably a stronger basis for estimating future risks, since it relates more directly to fuel prices and choices, along with regional transmission investments and reliability. It also highlights major differences in risk from site to site. If the energy strategy includes greenhouse gas reduction targets, a source energy view is essential. If only site energy is considered, measures like on-site combined heat and power may appear to add emissions, even though they may reduce emissions overall when measured against source energy.
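A worked example makes the combined heat and power (CHP) point concrete. Every number here is a hypothetical assumption for the sketch: a facility needing 100 MWh of electricity and 120 MWh of heat, an 80%-efficient gas boiler, a CHP unit at 35% electric and 45% thermal efficiency, a grid source emission factor of 0.45 tCO2 per delivered MWh, and 0.20 tCO2 per MWh of gas burned.

```python
# Hypothetical site-only vs source-based emissions accounting for
# on-site CHP. All inputs are illustrative assumptions.
GRID_EF, GAS_EF = 0.45, 0.20         # tCO2 per MWh (assumed)
ELEC_NEED, HEAT_NEED = 100.0, 120.0  # MWh demanded by the facility

# Baseline: grid electricity plus an 80%-efficient gas boiler.
boiler_gas = HEAT_NEED / 0.80
base_onsite = boiler_gas * GAS_EF               # emitted at the site
base_total = base_onsite + ELEC_NEED * GRID_EF  # including source emissions

# CHP sized to the heat load; its electricity displaces grid supply.
chp_gas = HEAT_NEED / 0.45           # 45% thermal efficiency
chp_elec = chp_gas * 0.35            # 35% electric efficiency
chp_onsite = chp_gas * GAS_EF
chp_total = chp_onsite + max(ELEC_NEED - chp_elec, 0.0) * GRID_EF

print(f"on-site: {base_onsite:.1f} -> {chp_onsite:.1f} tCO2")  # rises
print(f"total:   {base_total:.1f} -> {chp_total:.1f} tCO2")    # falls
```

With these assumed inputs, emissions measured inside the site perimeter rise (more gas is burned on site), while total emissions including the displaced grid generation fall, so a site-only view would wrongly penalize the measure.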
Using source energy to assess efficiency is only part of the story. It works well as long as the grid is broadly fueled by coal, gas, or uranium. This is rapidly changing around the world as renewable generation, mostly hydro, solar, and wind, is added to the grids. As renewables become more significant, the value of source energy as a reference point for efficiency and risk changes. So does the contribution of on-site measures to greenhouse gas reduction. In some cases, measures that historically would have reduced overall emissions may now increase them.
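One way to see why the reference point shifts is to compute a grid-mix-weighted source factor. The sketch below assumes thermal generation at roughly 3 units of primary fuel per delivered unit, and wind, solar, and hydro counted at 1:1, which is a common though not universal accounting convention.

```python
# Sketch: how the grid's effective site-to-source factor shifts as
# renewables are added. Factors are illustrative assumptions.
def effective_source_factor(renewable_share: float,
                            thermal_factor: float = 3.0,
                            renewable_factor: float = 1.0) -> float:
    """Weighted average source factor for a given renewable share."""
    return (renewable_share * renewable_factor
            + (1.0 - renewable_share) * thermal_factor)

for share in (0.0, 0.25, 0.50, 0.75):
    print(f"{share:.0%} renewable -> factor {effective_source_factor(share):.2f}")
```

As the renewable share grows, the same metered electricity saving represents less and less avoided primary fuel, which is why a source energy yardstick calibrated to a fossil-heavy grid gradually loses meaning.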
Peter Garforth heads a specialist consultancy based in Toledo, Ohio, and Brussels, Belgium. He advises major companies, cities, communities, property developers and policy makers on developing competitive approaches that reduce the economic and environmental impact of energy use. Peter has long been interested in energy productivity as a profitable business opportunity and has a considerable track record establishing successful businesses and programs in the US, Canada, Western and Eastern Europe, Indonesia, India, Brazil and China. Peter is a published author, has been a traveling professor at the University of Indiana at Purdue, and is well connected in the energy productivity business sector and regulatory community around the world. He can be reached at firstname.lastname@example.org.
The picture becomes even muddier as renewables find their way onto the site itself. As long as efficiency refers only to the meter, renewable supply within the site perimeter gets counted as efficiency. This is clearly a distortion, since overall energy use on the site has not changed; some of the off-site supply is simply now being generated on site. When efficiency is referenced to source energy, or primary fuel, the on-site renewables have an even bigger impact on reported efficiency, despite the fact that end-use efficiency is unchanged.
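The distortion can be quantified with a simple illustration. The numbers are assumptions: a site whose end uses consume 500 MWh of electricity, rooftop PV supplying 100 MWh of it, and an assumed source factor of 3.0 for the grid electricity displaced.

```python
# Illustration of how on-site renewables distort meter-referenced
# efficiency. All inputs are hypothetical.
END_USE = 500.0  # MWh actually consumed by processes and buildings
PV = 100.0       # MWh generated inside the site perimeter
SOURCE_FACTOR = 3.0  # assumed primary fuel per delivered grid MWh

metered = END_USE - PV
metered_gain = 1 - metered / END_USE  # looks like a 20% "saving"
end_use_gain = 0.0                    # nothing at the end uses changed
source_reduction = PV * SOURCE_FACTOR # primary fuel genuinely avoided

print(f"metered 'efficiency' gain: {metered_gain:.0%}")
print(f"end-use efficiency gain:   {end_use_gain:.0%}")
print(f"source energy avoided:     {source_reduction:.0f} MWh")
```

The meter reports a 20% improvement, and a source-referenced view magnifies the 100 MWh metered drop into 300 MWh of avoided primary fuel, yet the buildings and processes themselves are using exactly the same energy as before.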
This introduces the idea of a rigorous end-use definition of energy efficiency, one that measures the actual energy used in the final manufacturing processes, in the heating, cooling, and lighting of buildings, and in IT systems, irrespective of how that energy is supplied. This definition is often the most challenging to capture, but it is arguably the most significant measure of the energy effectiveness of a company’s processes. Comprehensive sub-metering is a prerequisite for measuring and managing end-use efficiency, and it is all too often a major missing element in energy programs.
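The bookkeeping that sub-metering enables can be sketched minimally: sum the end-use sub-meters and compare them against the utility meter to see how much consumption is actually accounted for. The meter names and readings below are entirely hypothetical.

```python
# Minimal sketch of end-use sub-metering reconciliation.
# All names and readings are hypothetical examples.
submeters_mwh = {
    "process": 220.0,
    "heating": 90.0,
    "cooling": 60.0,
    "lighting": 35.0,
    "it": 25.0,
}
utility_meter_mwh = 480.0

accounted = sum(submeters_mwh.values())    # total seen by sub-meters
unmetered = utility_meter_mwh - accounted  # consumption with no owner
coverage = accounted / utility_meter_mwh

print(f"sub-metered: {accounted:.0f} MWh ({coverage:.0%} of the utility meter)")
print(f"unexplained: {unmetered:.0f} MWh")
```

The unexplained remainder is exactly the energy that cannot be managed at the end-use level, which is why incomplete sub-metering undermines an end-use definition of efficiency.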
In addition to the absence of sub-metering, our own habits and perceptions get in the way of using end-use efficiency as a primary measure. We tend to think in terms of natural gas or oil to fire boilers, not heating. We instinctively jump to electricity to drive air conditioners, not cooling. Thinking in terms of energy end uses, rather than the utility supplies that serve them, is a shift in mindset that takes time and adjustment.
Measuring efficiency in terms of source energy and primary fuels is a powerful way to evaluate the impact of efficiency programs on the environment, along with future cost and reliability risks. Measuring it in terms of energy end use, independent of supply choices, is a powerful way to gauge the long-term energy competitiveness of our buildings and production lines. Measuring efficiency at the utility meters is a compromise between these two, and it frequently drives short-term energy decisions that lack a sound strategic basis.
So why do we so often default to the utility meters as our gold-standard reference for efficiency, despite the obvious shortcomings? The answer is simple: they are there, and they are easy to understand. In the future, industrial energy managers will be called upon to manage ever more subtle balances of fuel and renewable choices, process efficiencies, and multiple risks. We will need to understand efficiencies all the way from end use to fuel, and this will challenge us to rethink our definitions. The place of the utility meter in the energy value chain is an accident of history. This needs to be remembered if we are to be open to more flexible ideas of energy efficiency.