UPDATE 1/20: See the bottom of the post.
A group called CO2scorecard.org, whose efforts to compile energy data I have praised in the past, has issued a report arguing that so-called "energy rebound" at the micro-level might be in the range of 30% or less, rather than the higher levels argued by my colleagues at The Breakthrough Institute. Longtime readers of this blog, and readers of The Climate Fix, will know that I think the debate over the rebound effect is largely inconsequential to the debate over efforts to decarbonize the economy. But the report, and the reaction to it, provide a great opportunity to highlight a key intellectual challenge that we all face when overwhelmed with information: beware promoting bad analyses simply because they accord with your tribal convictions.
The CO2scorecard.org report contains a fundamental error that will be instantly obvious to anyone familiar with energy data: they confuse energy intensity and energy efficiency. What is the difference, you might wonder? Here is how the US Department of Energy describes it:
Energy Intensity is measured by the quantity of energy required per unit output or activity, so that using less energy to produce a product reduces the intensity.

Energy Efficiency improves when a given level of service is provided with reduced amounts of energy inputs or services are enhanced for a given amount of energy input.

When all else is equal, energy intensity (Energy/Output) is simply the inverse of energy efficiency (Output/Energy). But when all else is not equal, things get tricky, as the DOE explains:

Declines in energy intensity are a proxy for efficiency improvements, provided a) energy intensity is represented at an appropriate level of disaggregation to provide meaningful interpretation, and b) other explanatory and behavioral factors are isolated and accounted for.

What might those other explanatory factors be? Again, DOE:
Other explanatory factors cause changes in the energy use that have no bearing on the efficiency with which energy is used. These changes may be structural, they may be behavioral, or they may be due to factors, such as the weather, over which we have no control. These are sometimes collectively referred to as structural elements and they give rise to a change in energy use per unit measure of output, but do not reflect improvements in the underlying efficiency of energy use.

Now back to the CO2scorecard.org report. In one of the examples in their report, they argue that the lack of trend in the energy intensity of metal mining necessarily implies that there have been no efficiency improvements in that industry, writing:
Energy intensity is expected to worsen or remain stagnant over time as each mine extracts its metal from a lower grade ore at the margin (ore bodies have steadily declined in grade for decades, as high-grade ore-bodies have been depleted by mining). . . metal mining typically suffers from falling or stagnant energy efficiency.

I emailed the lead author of the report and he confirmed that throughout their analysis they do indeed equate energy efficiency with (the inverse of) energy intensity, just as they did in the metal mining example.
The key problem lies in the first sentence of the passage excerpted above: there have indeed been "structural changes" in metal mining, as the grade of ore has declined over time. Yet even with this decline, the same energy input has yielded almost the same metal output from the ore. It is therefore perfectly conceivable that, while energy per unit output has remained constant, the efficiency of extraction has improved as ore quality declined. There may in fact have been energy efficiency gains while energy intensity stayed flat; you simply cannot tell from an analysis of energy intensity alone. Whoops, big mistake by CO2scorecard.org.
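A toy calculation makes this concrete; the numbers below are invented for illustration and come from neither CO2scorecard.org nor Saunders. Suppose ore grade falls from 2% to 1% over two decades while measured energy intensity, in kWh per ton of metal, stays flat:

```python
# Invented numbers for illustration: a flat energy intensity (energy per ton of
# metal) can hide a real efficiency gain when ore grade declines.

def ore_per_ton_metal(grade: float) -> float:
    """Tons of ore that must be processed to recover one ton of metal."""
    return 1.0 / grade

ENERGY_PER_TON_METAL = 1000.0  # kWh per ton of metal; the intensity metric, flat by assumption

for year, grade in [(1990, 0.02), (2010, 0.01)]:  # ore grade falls from 2% to 1%
    ore_processed = ore_per_ton_metal(grade)      # 50 tons, then 100 tons of ore
    energy_per_ton_ore = ENERGY_PER_TON_METAL / ore_processed
    print(f"{year}: intensity = {ENERGY_PER_TON_METAL:.0f} kWh/t metal, "
          f"energy per ton of ore processed = {energy_per_ton_ore:.0f} kWh")
```

Intensity reads 1000 kWh per ton of metal in both years, yet the energy needed to process a ton of ore has halved, from 20 kWh to 10 kWh. The extraction process became roughly twice as efficient while the intensity metric never moved.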
Harry Saunders, author of the paper that CO2scorecard seeks to critique, told me by email that this is only one of many confounding structural factors at play in the data used in the paper:
There are too many drivers of energy intensity at work, all operating in different ways. For example, changes in energy intensity are driven not just by energy efficiency gains but by movements in energy prices. Worse, they are also driven by price movements in all other factors of production. Worse still, they are driven by technology gains for all other factors. Without knowing these, it is impossible to know how energy efficiency has evolved in any particular sector.

But the thorniest problem is that one cannot measure rebound effects without evaluating two counterfactuals: what energy use would have looked like in the absence of any energy efficiency gains, and what energy use would have looked like had energy efficiency gains "taken" on a one-for-one basis. Only with these in hand can one make any definitive statements about rebound magnitudes. One certainly cannot do it by looking at a single trajectory of energy demand, let alone a single trajectory of energy intensity.

The critical point here is that it is fundamentally impossible to discern from intensity trends what energy efficiency gains have occurred. On top of this, to then believe it is possible to discern the rebound effects hidden in these trends could kindly be called a fool's errand.

I don't need to know much more than the fact that the authors don't know the difference between intensity and efficiency to dismiss the report as a poorly done analysis, which is a common-enough occurrence on the web.
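To make Saunders' two-counterfactual point concrete, here is a minimal sketch with numbers invented for this post (they come from neither paper). It computes a rebound magnitude as the share of engineering-expected energy savings that fails to materialize:

```python
# Invented numbers: a rebound estimate needs two counterfactuals, not just the
# single observed trajectory of energy demand.

baseline = 100.0        # counterfactual 1: energy use with no efficiency gains
efficiency_gain = 0.20  # an engineering efficiency improvement of 20%
full_take = baseline * (1 - efficiency_gain)  # counterfactual 2: gains "take" one-for-one -> 80.0
observed = 88.0         # the only number the data actually give you

expected_savings = baseline - full_take       # 20.0
actual_savings = baseline - observed          # 12.0
rebound = 1 - actual_savings / expected_savings
print(f"rebound = {rebound:.0%}")             # -> rebound = 40%
```

The observed figure is the only one that can be read off the data; the no-gains baseline and the full-take level are both counterfactuals that must be estimated before any rebound number means anything, which is exactly the point.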
If it were just an error-riddled report on the internet it might be worth ignoring. But CO2scorecard.org use the report to dismiss an entire peer-reviewed literature and to attack the motives of those they disagree with – specifically Harry Saunders and the Breakthrough Institute (where I, along with Saunders, am a Senior Fellow), which recently did an excellent literature review of energy rebound. In their report the CO2scorecard.org folks decided to get nasty, making up claims about the policy positions of BTI that bear no resemblance to reality, perhaps to help their analysis get picked up.
Not surprisingly, the CO2scorecard.org report was promoted favorably by Joe Romm, who apparently liked it for whom it attacked but did not do his homework by reading and evaluating the report's substance. Egg on Joe.
It is of course possible that an obscure group could overturn an entire literature with a short internet report, but it is pretty unlikely. If CO2scorecard.org think that they are on to something important, they should do what scholars do and publish their work in the peer-reviewed literature.
I am confident that over the long run good arguments sort themselves out from bad ones. In the meantime, while there is no crime in being wrong, there is a big risk for those who accept bad arguments simply for tribal reasons rather than on the merits of what is being argued.
Note: For those actually interested in the academic literature on rebound effects (not me!) please see this report of the EU (PDF) and this report from The Breakthrough Institute. And lest there be any confusion, I am a big supporter of energy efficiency, as I have written extensively, such as in this piece.

UPDATE 1/20: CO2scorecard.org have responded to this post. They state:

[T]he dataset used in Saunders 2010 is itself inappropriate for analysis of energy rebound.

They later state:

[W]e used energy intensity measured in physical units at the same level of industry aggregate as Harry Saunders.

If Saunders' data isn't appropriate for analysis of rebound, then why are they using it to make claims about the magnitude of energy rebound? Enough said on that. They then have some silly passage about me and grey literature and the IPCC in an effort to somehow change the subject. Snore.