30 April 2011

Peter Gleick Responds

Peter Gleick has graciously sent me a response to my critique of his Huffington Post column for posting here.  His response appears below, followed by my rejoinder.  Thanks Peter!
My opening paragraph is not claiming attribution, but that the extreme events of the past week must remind us that the climate is worsening. I think that is undeniable. But moreover, the difficulty in attribution is not the same as proof there is no connection. Indeed, I think it likely that every single climatic event we see today is, to some growing but unquantified degree, influenced by the changing climate -- this is the classic attribution problem. Just as you might (and indeed I might) reject any definitive statement about attributing an effect of climate change on these recent events, I (and I would hope YOU) would reject any definitive statement claiming there is NO effect. As the good NY Times piece on this pointed out, we don't know enough about the dynamics, and the model resolutions are not fine enough to test.

My comment about deaths and destruction was not specific to tornadoes, but to climatic extremes overall, globally. Read the whole piece carefully. And it refers to what I believe is an inevitable growing (not declining) risk from these climatic extremes, which include floods, droughts, sea level rise, hurricanes, etc.

Here is a good example of the misrepresentations of deniers like Morano (in which camp I do NOT put you, of course): We see strawman arguments making fun of any reference to tsunamis, as though any climate scientist argues a connection between climate change and frequency of tsunamis. But there IS a connection: not of attribution, but of consequences. Deniers conveniently (for them if not for society) ignore the consequences of two similar-sized 20-foot tsunamis (for example), but one with a foot-higher sea level, hitting a 20.5-foot tsunami wall. In the first case, nothing; in the second case, disaster. That's the reality of future climate change and the important distinction related to threshold events.

But I was also shocked at what I consider a gross misuse by you of the tornado death graphic at the very top of your blog, as though that graph was relevant to climatic trends. The historical number of deaths reflects not just tornado frequency and intensity, but location, population dynamics and trends, advanced early warning technology and experience, housing construction trends, and many other factors completely unrelated to climate. It is perfectly plausible to have a clear worsening climate signal and a trend of deaths going in the other direction. Your use of the graph was inappropriate and unsupportable, though it has certainly been adopted by the denier community.

[Finally, for the inevitable complaint about my use of the term "denier," I use it for those who use it to describe themselves, and if you want plenty of examples, I've got them.]
Pielke's rejoinder:
Thanks, Peter.  Here are a few reactions to your response.

1. You seem to want things both ways. You write that the tornadoes this week are a "reminder" that "our climate is worsening" which will lead to more "death, injury, and destruction."  Now you say that you are "not claiming attribution" but then maintain that "every single event" is influenced by climate change. I am sure that I am not alone in reading your commentary as making an explicit link between this week's tornadoes and human-caused climate change.

2. If you invoke tornadoes and climate change in the immediate aftermath of >300 deaths, writing "we're affecting the climate; in turn, that will affect the weather; and that, in turn, will affect humans: with death, injury, and destruction," then you should expect people to interpret your post exactly as I have. Even with a broader focus on deaths from extreme weather events around the world, there is at this time no scientific basis for asserting a connection between human-caused climate change and deaths from any of these phenomena (e.g., PDF).  If you really want to defeat "the deniers" then my advice is to refrain from giving them such easy targets to shoot down.

3. At no point did I suggest or imply that a graph of loss of life from tornadoes can be used to say anything about human-caused climate change, much less "deny" it. If you are familiar with my work at all (and I assume that you are) then you will know that I have repeatedly argued that you cannot use trends in loss of life (much less the loss of life in one day) to say anything about climate trends or causality of those trends, as you do in your piece.  If you want to say something about climate trends, then look first at climate data and not messy impact data -- and here is what NOAA/NCDC says about trends in the strongest tornadoes that cause >70% of deaths in the US.
Of course we need to be careful interpreting such trends because tornado data is problematic for various reasons, which makes it very difficult to argue that human-caused climate change is making tornadoes worse. Remember that the IPCC defines climate change as a change in the statistics of weather over 30-50 years and longer. For extremes, rare by definition, such trend detection will all but certainly require much longer time periods.

I appreciate your engagement.

29 April 2011

Bill Hooke on Tornadoes

EDITOR'S NOTE:  The brilliant guest post below comes from Bill Hooke, Director of the American Meteorological Society's Policy Program, who first published it on his blog, Living on the Real World, on April 23. Bill is a long-time friend, colleague and mentor.  This post, and his blog, deserve a broad readership. H/T: JG
Guessing Games (remember Battleship?), Tornadoes, and Lambert-St. Louis International Airport

Today [April 23] St. Louis suffers, but breathes a sigh of relief. Last night a storm moving through the area spawned high winds, hail, and one or (possibly) more tornadoes that destroyed dozens of homes and hit the main terminal building of Lambert-St. Louis International Airport, badly damaging the roof, and scattering glass throughout the structure. As of this morning, the airport remains closed indefinitely, with officials saying it will take days to put the facilities to rights. Miraculously, though several people were injured, no one was killed. Sounds trite, but it could have been worse.

Across the nation, the catastrophe is but the most recent of this spring. Recall that the United States is really the tornado capital of the world; only 10% of tornadoes occur anywhere else around the globe, and they’re the weaker ones. And this year, tornadoes have been in our news for weeks.

Tornadoes present a unique challenge to the public – that would be the some 200 million of us who live in harm’s way – to policymakers, and to hazards managers. Think about it. The strongest tornadoes pack winds approaching 300 miles per hour. That’s twice the wind speed that you see in category-5 hurricanes. And the force goes as the square of the wind speed, meaning the greatest tornadic winds pack four times the wallop.
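[A quick check of that scaling, as a minimal Python sketch: wind loading on a structure goes roughly as the square of wind speed, so doubling the speed quadruples the force. The 300 and 150 mph figures are just the round numbers used above.]

```python
# Wind loading scales roughly with the square of wind speed,
# so doubling the wind speed quadruples the force on a structure.
tornado_mph = 300.0    # strongest tornadic winds (round number from the text)
hurricane_mph = 150.0  # roughly a category-5 hurricane (round number from the text)

ratio = (tornado_mph / hurricane_mph) ** 2
print(f"Relative wind loading: {ratio:.0f}x")  # -> 4x
```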

In principle, we could design structures that would survive such power. In fact, armies did, in World War II, to protect gun emplacements from artillery shells. They called them pillboxes. Much of the structure lay below grade. Any view was afforded through narrow slits. Glass? You’ve got to be kidding.

If tornadoes were ubiquitous, and present all the time, and if despite such continuous violence, the human race had developed to its present point, we would live this way. But each year’s tornado tracks cover only a small area. [Brace yourself, a bit of arithmetic coming up!] Sticking to round numbers, let’s say we have 1000 tornadoes a year here. Let’s go a little further, and figure that for each one, there’s a swath of damage maybe ¼ mile wide, but 4 miles long. This is probably an over-estimate for the smaller, more common ones; but an underestimate for the bigger ones. So that’s one square mile of damage for each tornado. For the whole year? Maybe 1000 square miles of damage. Picture that as a square roughly 30 miles on a side…and now compare that with the damage swath for a single hurricane, making landfall. There an area maybe 30-50 miles wide, maybe much more, is affected, and the storm penetrates many miles inland.
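[The round-number arithmetic above, written out as a short Python sketch:]

```python
# One square mile of damage per tornado, times ~1000 tornadoes a year.
tornadoes_per_year = 1000
swath_width_mi = 0.25   # miles
swath_length_mi = 4.0   # miles

area_per_tornado = swath_width_mi * swath_length_mi          # 1 square mile
annual_damage_area = tornadoes_per_year * area_per_tornado   # ~1000 square miles

side_of_square = annual_damage_area ** 0.5                   # ~31.6 miles on a side
print(annual_damage_area, round(side_of_square, 1))
```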

So the area likely to be damaged by all the tornadoes of a single year might be comparable to the area of property loss affected by a single hurricane.

Now meteorologists are no different from your doctor or your stock broker. We all say, “Past performance is no guarantee of future performance. Actual results may vary.” But you get the general idea.

Now let’s compare the 1000 square miles damaged each year, first with the 2 million square miles of U.S. land in tornado-risk areas. Any particular point (a home, or a building)? Maybe only one in 2000 is hit in any given year. That means (pointy-headed statistical alert) that your house has a 50-50 chance of being hit over a 1300-year period. And even if my back-of-the-envelope calculation is off by, say, a factor of four (quite likely, by the way), a tornado might hit your actual house only once in 300 or so years. Seems like a long time, maybe.
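[For the truly pointy-headed, here is a minimal sketch of that calculation, assuming every point in the risk area is equally likely to be hit in any given year, independently from year to year:]

```python
import math

damaged_area_mi2 = 1000.0      # square miles damaged per year (round number)
risk_area_mi2 = 2_000_000.0    # U.S. land in tornado-risk areas (round number)

p_hit_per_year = damaged_area_mi2 / risk_area_mi2   # ~1 in 2000

# Years until a fixed point has a 50-50 chance of having been hit at least once.
years_to_even_odds = math.log(0.5) / math.log(1.0 - p_hit_per_year)
print(round(years_to_even_odds))   # ~1386 years -- "a 1300-year period" in round terms

# If the estimate is off by a factor of four (more risk), the wait shortens.
years_if_4x_risk = math.log(0.5) / math.log(1.0 - 4 * p_hit_per_year)
print(round(years_if_4x_risk))     # ~346 years -- "once in 300 or so years"
```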

And that’s why we really don’t design homes, even in tornado-prone areas, to survive a direct tornado hit. However, in Oklahoma, the odds of being hit are higher. And people in Oklahoma know this. What to do? One strategy? Tornado shelters, below ground, outside the home (remember Dorothy and the Wizard of Oz?). They look rather like those World War II pillboxes. Of course, it’s not that easy to install them properly in clay soils, and when unused, they become habitat for insects and snakes, etc. Some prefer a suggestion put forth by Ernst Kiesling, who proposed that houses be constructed with a reinforced interior safe room.

Now let’s think about urban areas, versus individual homes. In the United States, about 2% of the land area is today considered urban. But this percentage has been growing at a rate of about 10% per decade – maybe doubling since the end of World War II. So back then, if one percent of the land was urban, maybe we could expect 10 tornadoes a year on average to hit heavily populated areas. But going forward, we can expect that figure to be 20 tornadoes or so. And, as we continue to concentrate our population, the chances for a truly catastrophic tornado event inexorably mount.
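[The expected-strike arithmetic in the paragraph above, as a small sketch using the same round shares of urban land:]

```python
tornadoes_per_year = 1000

urban_fraction_1945 = 0.01   # ~1% of U.S. land urban around the end of WWII (round figure)
urban_fraction_now = 0.02    # ~2% of U.S. land urban today

print(tornadoes_per_year * urban_fraction_1945)  # ~10 urban strikes per year back then
print(tornadoes_per_year * urban_fraction_now)   # ~20 urban strikes per year going forward

# Growth of ~10% per decade compounds to roughly a doubling over seven decades.
print(round(1.10 ** 7, 2))   # ~1.95, i.e. about double since the end of WWII
```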

In a word: tornadoes hitting downtown areas in the past? Rare – almost unheard of. But tornadoes hitting downtown areas in the future? Increasingly common.

It’s time to start building awareness of such risks and developing plans. Evacuating urban buildings? Problematic. Opportunities to shelter-in-place? Minimal. Working through a strategy providing for safety in the face of this threat will require the best minds in both the private and public sectors. And it ought to start now.

A final note on airports. These by themselves are an even smaller fraction of the national real estate than the urban areas in which they’re embedded, right? But on May 5, 1995, a line of thunderstorms producing softball-sized hail went over Dallas-Fort Worth Airport at a time when American Airlines was conducting a hub operation. The hail caught ten percent of American’s entire fleet on the ground. All those planes were grounded for several days while they were inspected. Some required considerable maintenance and were idled for a longer period. Just sixteen years later, another airport has been hit. An example – only one – of our increasing vulnerability to small-scale, violent weather.

Do you remember that old pencil-and-graph-paper game (or more current electronic versions) you played called “Battleship?”

You’d pencil in a fleet of warships at different locations on the gridded paper (say letters along one side and numbers across the page). You and your opponent would then take turns trying to “sink” the opponent’s ships, by guessing their coordinates and “shooting” at those locations. Once you’d hit every square occupied by a ship, that ship was sunk and could no longer fire back. Sink all your opponent’s ships and you won. [Maybe you had more of a life. But my brother and I would play this.]

Anyway, with our urbanized populations and critical infrastructure, we’re playing “Battleship” with the Earth on which we live.

Only in this Real World, it’s always Earth’s turn.

28 April 2011

Weather is Not Climate Unless People Die

UPDATE: At Dot Earth, Andy Revkin has collected a great set of expert perspectives on this event.

UPDATE #2: Subsequent to this post Gleick has added a new parenthetical to his post, that says the opposite of his first paragraph: "More extreme and violent climate is a direct consequence of human-caused climate change (whether or not we can determine if these particular tornado outbreaks were caused or worsened by climate change)." 

Peter Gleick is only the most recent climate scientist to try to exploit extreme weather for political gain, writing at The Huffington Post:
Violent tornadoes throughout the southeastern U.S. must be a front-page reminder that no matter how successful climate deniers are in confusing the public or delaying action on climate change in Congress or globally, the science is clear: Our climate is worsening. . .

In the climate community, we call this "loading the dice." Rolling loaded dice weighted toward more extreme and energetic weather means more death and destruction.
You can see in the graph above that there is no upward trend in US tornado deaths, 1940-2010 (PDF).  This year's very active season and tragic loss of life won't alter that conclusion.  Actually, there is a sharp downward trend during a period when US population grew a great deal (consider this graph from Harold Brooks for a longer-term perspective; UPDATE: see below for this graph through 2010).  There is obviously no evidence of "more death and destruction."  On the lack of trends in destruction see this paper.
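For readers who want to run this sort of check themselves, here is a minimal sketch of the calculation behind such a claim: fit a simple linear trend to annual death counts. The file name and column names are hypothetical placeholders for whatever fatality dataset you have on hand (e.g., the annual counts behind the graph above).

```python
# Fit a simple linear trend to annual U.S. tornado deaths.
# "tornado_deaths.csv" with columns "year" and "deaths" is a hypothetical
# placeholder for your own copy of the data.
import numpy as np
import pandas as pd

df = pd.read_csv("tornado_deaths.csv")
years = df["year"].to_numpy(dtype=float)
deaths = df["deaths"].to_numpy(dtype=float)

slope, intercept = np.polyfit(years, deaths, deg=1)
print(f"Linear trend: {slope:+.2f} deaths per year")
# A negative slope over 1940-2010 is consistent with the downward trend noted
# above; a per-capita series would decline even more steeply.
```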

On the significance of yesterday's tragic tornado outbreak, consider this perspective from NOAA:
What's the risk of another super-outbreak like April 3-4, 1974? It's rare; but we don't know how rare, because an outbreak like that has only happened once since tornado records have been kept. There is no way to know if the odds are one in every 50 years, 10 years or 1,000 years, since we just do not have the long climatology of reasonably accurate tornado numbers to use. So the bigger the outbreaks, the less we can reliably judge their potential to recur.
Gleick's column is all the more ironic for this statement:
Climate deniers who have stymied action in Congress and confused the public -- like the tobacco industry did before them -- need to be held accountable for their systematic misrepresentation of the science, their misuse and falsification of data, and their trickery.
Obviously, it is not just climate deniers who are engaged in misrepresentation and trickery. Here is what Gleick wrote just a few months ago:
While the public may not fully understand the difference between climate and weather, or understand how the world could be warming while it's cold outside, most well-known climate deniers fully understand these distinctions -- they just choose to ignore them in order to make false arguments to and score points with the public and gullible policymakers. Cherry-picking selected data that supports a particular point (i.e., it's cold today), while hiding or ignoring more data that points in exactly the opposite direction (i.e., global average temperatures are rising), is bad science and it leads to bad policy. Just last week Glenn Beck pointed to a snowstorm in Minneapolis as proof that global warming isn't happening. He knows better, but his audience may not.
Well said Peter.

UPDATE: Here is a graph of US tornado deaths 1875-2010, data courtesy of Harold Brooks, NOAA (Thanks Harold!):

27 April 2011

NASA on Shuttle Costs

At USA Today Dan Vergano has an article taking a look back at the Space Shuttle program, which is scheduled to fly its last flight this year.  I chatted with Dan yesterday about the program and in particular the difference between the estimates that NASA provided to him of total program costs, $113.7 billion, and those that Rad Byerly and I recently published, $192 billion.

My first impression was that NASA did not adjust for inflation in their tabulation (something the students in my quantitative methods seminar this past term learned is a methodological no-no).  Dan followed up, and this was indeed the case. Here is an excerpt from the email that Dan received from NASA Public Affairs, shared with Dan's permission (emphasis added):
The number I gave you is actual dollars (that is we added the dollar amount for each fiscal year and came up with a total). The author [Pielke] seems to have adjusted his numbers to 2010 dollars then added them up. Because we don't know how he computed his adjustment, we can't comment on how he arrived at his number.

We estimate the total cost of the program in 2010 dollars from FY1971 to FY2010 (which does not include STS-133, STS-134 or STS-135) would be about $209.1 billion.
I'm not sure what is in NASA's numbers that is not in ours, or how they computed the inflation adjustment, but we did our calculations conservatively, so I'm not surprised at the higher tabulation. However, this is the first time in the 20 years that I've been looking at this issue that NASA has reported a higher number than we do. The per-flight difference is small -- $1.6 billion per flight from NASA versus $1.5 billion from us -- so not a big difference, but it is nice to see a convergence of views.
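To make the methodological point concrete, here is a minimal sketch of why summing "actual" (nominal) dollars understates the cost of a multi-decade program: each year's outlay has to be converted to a common base year before summing. The outlays and deflator factors below are illustrative placeholders, not the actual Shuttle budget history.

```python
# Illustrative only: nominal outlays ($B) by fiscal year and hypothetical
# factors converting each year's dollars into constant 2010 dollars.
nominal_outlays = {1971: 0.1, 1985: 2.0, 2000: 3.0, 2010: 3.0}
deflator_to_2010 = {1971: 5.4, 1985: 2.0, 2000: 1.3, 2010: 1.0}

nominal_total = sum(nominal_outlays.values())
real_total_2010 = sum(spend * deflator_to_2010[year]
                      for year, spend in nominal_outlays.items())

print(f"Summed nominal dollars:       ${nominal_total:.1f}B")
print(f"Summed constant 2010 dollars: ${real_total_2010:.1f}B")
# The constant-dollar total is always the larger one, which is why NASA's
# nominal $113.7 billion and the inflation-adjusted figures ($192-$209 billion)
# are so far apart.
```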

Karen Clark on Catastrophe Models


Karen Clark, one of the founders of the catastrophe modeling industry, is interviewed by Insurance Journal in the podcast linked above. There is also an edited transcript here.

Here is an excerpt from the accompanying news story:
The need for insurers to understand catastrophe losses cannot be overestimated. Clark’s own research indicates that nearly 30 percent of every homeowner’s insurance premium dollar is going to fund catastrophes of all types.

“[T]he catastrophe losses don’t show any sign of slowing down or lessening in any way in the near future,” says Clark, who today heads her own consulting firm, Karen Clark & Co., in Boston.

While catastrophe losses themselves continue to grow, the catastrophe models have essentially stopped growing. While some of today’s modelers claim they have new scientific knowledge, Clark says that in many cases the changes are actually due to “scientific unknowledge” — which she defines as “the things that scientists don’t know.”
These comments are followed up in the interview:
Your concern is that insurers and rating agencies, regulators and a lot of people may be relying too heavily on these models. Is there something in particular that has occurred that makes you want to sound this warning, or is this an ongoing concern with these?

Clark: Well, the concern has been ongoing. But I think you’ve probably heard about the new RMS hurricane model that has recently come out. That new model release is certainly sending shockwaves throughout the industry and has heightened interest in what we are doing here and our messages…. [T]he new RMS model is leading to loss estimate changes of over 100 and even 200 percent for many companies, even in Florida. So this has had a huge impact on confidence in the model.

So this particular model update is a very vivid reminder of just how much uncertainty there is in the science underlying the model. It clearly illustrates our messages and the problems of model over reliance.

But don’t the models have to go where the numbers take them? If that is what is indicated, isn’t that what they should be recommending?

Clark: Well, the problem is the models have actually become over-specified. What that means is that we are trying to model things that we can’t even measure. The further problem with that is that these assumptions that we are trying to model, the loss estimates are highly sensitive to small changes in those assumptions. So there is a huge amount of uncertainty. So just even minor changes in these assumptions, can lead to large swings in the loss estimates. We simply don’t know what the right measures are for these assumptions. That’s what I meant… when I talked about unknowledge.

There are a lot of things that scientists don’t know and they can’t even measure them. Yet we are trying to put that in the model. So that’s really what dictates a lot of the volatility in the loss estimates, versus what we actually know, which is very much less than what we don’t know.
In the interview she recommends the use of benchmark metrics of model performance, highlights the importance of understanding irreducible uncertainties, and gives a nod toward the use of normalized disaster loss studies.  Deep in our archives you can find an example of a benchmarking study that might be of the sort Clark is suggesting (here in PDF).
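To illustrate Clark's point about over-specification, here is a toy sketch (not any vendor's model, and every number in it is hypothetical) of how modest tweaks to hard-to-measure assumptions can move a modeled loss estimate by a large amount.

```python
# A toy catastrophe "model": expected annual loss as a function of an assumed
# landfall rate, a mean wind speed, and a damage exponent. The exponent is
# exactly the kind of parameter that cannot really be measured.
def expected_annual_loss(landfall_rate, mean_wind_mph, damage_exponent,
                         exposure_billions=100.0):
    damage_fraction = (mean_wind_mph / 250.0) ** damage_exponent
    return landfall_rate * damage_fraction * exposure_billions

base = expected_annual_loss(landfall_rate=0.60, mean_wind_mph=120, damage_exponent=4.0)
bumped = expected_annual_loss(landfall_rate=0.66, mean_wind_mph=126, damage_exponent=3.8)

swing = (bumped / base - 1) * 100
print(f"Base estimate:   ${base:.2f}B per year")
print(f"Bumped estimate: ${bumped:.2f}B per year ({swing:.0f}% higher)")
# Tweaking each assumption by 10% or less moves the loss estimate by more
# than 50% -- the kind of swing that undermines confidence in the output.
```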

26 April 2011

Paranoid Style in Climate Politics

A colleague reminds me of this 1964 essay in Harper's by historian Richard Hofstadter, which I recall having encountered in grad school.  The essay was recently invoked by The Weekly Standard and, according to Wikipedia, is frequently used in contemporary debates.  Perhaps too frequently. 

Even so, this excerpt reminded me of a style of argumentation that has become disturbingly prominent in contemporary climate debates:
The paranoid spokesman sees the fate of conspiracy in apocalyptic terms—he traffics in the birth and death of whole worlds, whole political orders, whole systems of human values. He is always manning the barricades of civilization. He constantly lives at a turning point. Like religious millennialists he expresses the anxiety of those who are living through the last days and he is sometimes disposed to set a date for the apocalypse. (“Time is running out,” said Welch in 1951. “Evidence is piling up on many sides and from many sources that October 1952 is the fatal month when Stalin will attack.”)
   
As a member of the avant-garde who is capable of perceiving the conspiracy before it is fully obvious to an as yet unaroused public, the paranoid is a militant leader. He does not see social conflict as something to be mediated and compromised, in the manner of the working politician. Since what is at stake is always a conflict between absolute good and absolute evil, what is necessary is not compromise but the will to fight things out to a finish. Since the enemy is thought of as being totally evil and totally unappeasable, he must be totally eliminated—if not from the world, at least from the theatre of operations to which the paranoid directs his attention. This demand for total triumph leads to the formulation of hopelessly unrealistic goals, and since these goals are not even remotely attainable, failure constantly heightens the paranoid’s sense of frustration. Even partial success leaves him with the same feeling of powerlessness with which he began, and this in turn only strengthens his awareness of the vast and terrifying quality of the enemy he opposes.
   
The enemy is clearly delineated: he is a perfect model of malice, a kind of amoral superman—sinister, ubiquitous, powerful, cruel, sensual, luxury-loving. Unlike the rest of us, the enemy is not caught in the toils of the vast mechanism of history, himself a victim of his past, his desires, his limitations. He wills, indeed he manufactures, the mechanism of history, or tries to deflect the normal course of history in an evil way. He makes crises, starts runs on banks, causes depressions, manufactures disasters, and then enjoys and profits from the misery he has produced. The paranoid’s interpretation of history is distinctly personal: decisive events are not taken as part of the stream of history, but as the consequences of someone’s will. Very often the enemy is held to possess some especially effective source of power: he controls the press; he has unlimited funds . . .
The essay was originally written in reference to the American far-right, but obviously has more general applicability.

23 April 2011

The Technological Egg and Regulatory Chicken

A few bloggers (e.g., here and here and here) have rediscovered an interest in the case of ozone depletion, apparently via a desire to try to impeach Matt Nisbet's new report in any way possible. Nisbet makes a very brief reference to The Hartwell Paper's discussion of ozone policy. The lessons of ozone depletion policy are well worth understanding, so this post is a useful follow-up.

In The Hartwell Paper (PDF) the only discussion of the lessons of ozone depletion policy was to characterize it as a "tame" problem amenable to a technological fix, as compared to climate change, which is a classically "wicked" problem:
Originally described by Rittel and Webber in the context of urban planning, ‘wicked’ problems are issues that are often formulated as if they are susceptible to solutions when in fact they are not. Technical knowledge was taken as sufficient basis from which to derive Kyoto’s policy, whereas ‘wicked’ problems demand profound understanding of their integration in social systems, their irreducible complexity and intractable nature. . .

The consequence of this misunderstanding was that there was a fundamental framing error, and climate change was represented as a conventional environmental ‘problem’ that is capable of being ‘solved’. It is neither of these. 
The key here is that ozone depletion was a "tame" problem in the sense that it was not only amenable to being solved, but being solved via a technological fix.  Climate change by contrast has more in common with issues such as poverty and conflict, in that such problems cannot be solved once and for all -- but we can do more or less well on them -- and, we argue, they certainly cannot be "solved" via a simple technological fix.

OK, back to the claims now being advanced about ozone.  Apparently some people want to believe that regulation in the case of ozone depletion caused technologies to emerge, almost instantaneously, and that from this lesson we should take something relevant for climate policy: perhaps that regulation (if only we could pass it) would likewise call forth an immediate technological fix.

The Hartwell Paper formulation is being challenged in current blog discussion by Edward Parson, a professor at Michigan, who wants to take issue with the idea that technological alternatives to CFCs helped to move along CFC regulations. He writes that it was science, and science alone, that led to regulation: "this was ALL about responding to scientific evidence for the risk, not a bit about availability of [CFC] alternatives."

It is my view that Parson's claims are way overstated. Consider that DuPont, the world's major producer of CFCs at the time with a 25% market share, had patented a process for manufacturing HFC-134a (the leading CFC alternative) in 1980, after identifying it as a replacement for Freon in 1976, and had applied for more than 20 patents for CFC alternatives immediately before and after the signing of the Montreal Protocol. DuPont saw alternatives as a business opportunity; e.g., its Freon division head explained in 1988: "There is an opportunity for a billion-pound market out there." The fact that ozone regulations focused on production, and not consumption, meant that there would still be a market for conventional CFCs into the 1990s, slowing down the deployment of substitutes and making the transition easier for industry. DuPont's decision to back regulation was motivated more by economic opportunity -- an opportunity that existed solely because of substitutes -- than by scientific arguments alone, though there is no doubt that the science played a role in the process.

In The Climate Fix (pp. 26-28) I argue that incremental technological advances on CFC alternatives (really starting in the 1970s) helped to grease the skids for incremental policy action creating a virtuous circle that began long before Montreal and continued long after (see this paper in PDF for a more in depth discussion).  In her excellent book on ozone depletion policy, Ozone Discourses, Karen Litfin explains (p. 95):
The issue resembles a chicken-and-egg situation: without regulation there could be no substitutes but, at least in the minds of many, without the promise of substitutes there could be no regulation.
This is indeed very much my view. It seems fairly obvious that the ease of deploying technological fixes can help to make it easier for regulations to be put into place, and the history of environmental (and other) regulations bears this out.

But for the purposes of discussion, let's take Parson's view as if it were true. He explains:
The crucial technological advances that demonstrated the viability of alternatives all came after, not before, the political decision to impose 50% CFC cuts -- and the effort to generate these advances was motivated by the imminent threat of these regulatory restrictions -- not the reverse.
If we are to believe this, then we must also conclude that the chemical industry, notably DuPont, started working on technological substitutes no earlier than 1986 and within 3 years had not only demonstrated their viability, but had done so in a manner that began to allow rapid deployment displacing conventional CFCs (see figure at the top of this post).  If this was the case then the ozone issue was even more tame than we have argued in The Hartwell Paper -- it was in effect technologically trivial.  If Parson's history is correct then it leads one to conclude that the ozone case is even less relevant to climate change than we have argued (unless one wants to advance the fantasy that decarbonization of our economy is technologically trivial, awaiting only the regulatory magic wand).

Parson does not explain explicitly why he thinks his revisionist history matters in the context of climate change, but presumably it is because he and others want to believe that (a) a battle for climate regulations should be waged through arguments over science, and (b) that winning such battles over regulations can make technologies magically appear after those regulations are committed to by governments.

I write in The Climate Fix that if technological substitution for our fossil fuel-based energy system were as easy as it was for ozone, we wouldn't be debating the issue -- it would have been solved already.  No amount of revisionist history can change the magnitude of the challenge of decarbonizing the global economy.  One lesson that I take from the ozone history for climate change policy is that pricing or regulating carbon, through whatever policy mechanisms, will be far more politically possible to the degree that technological substitutes are effective on performance and price.

UPDATE: Here is Richard Benedick, chief US negotiator, on the lessons of the ozone experience as related to climate change (emphasis added):
It is worth recalling that the 1987 Montreal Protocol on Substances That Deplete the Ozone Layer, later characterized by the heads of the UN Environment Program and the World Meteorological Organization as “one of the great international achievements of the century,” was negotiated by only about 30 nations in nine months, with delegations seldom exceeding six persons and with minimal attention from outside observers and media. I doubt whether the ozone treaty could have been achieved under the currently fashionable global format.

We might draw some useful lessons from the ozone history. In the late 1970s, the ozone science was actually much more disputed than the climate science of today, and the major countries that produced and consumed chlorofluorocarbons (CFCs) were hopelessly deadlocked over the necessity for any controls at all. In this situation, the first international action on protecting the ozone layer was neither global, nor even a treaty. Rather, it was an informal accord among a loose coalition of like-minded nations, including Australia, Canada, Norway, Sweden, and the United States, to individually and separately ban the use of CFCs in aerosol spray cans.

This measure alone resulted in a temporary 30% drop in global CFC consumption (temporary because these “wonder chemicals” were continuing to find new uses in numerous industries.) But the action was nevertheless significant for the future. The resultant technological innovations demonstrated to the skeptics (in this case the European Community, Japan, and the Soviet Union) that controls were feasible, at least for this class of products. It also gave the United States and other proponents of a strong treaty the moral and practical high ground in later negotiations to restrict all uses of CFCs. Yet, if anyone had actually proposed a 30% reduction target, it would surely have been rejected as impossible.

An important lesson here is that a specific policy measure, not an abstract target, could stimulate unanticipated technological innovation. The policy measure drove the agreement on targets in the later ozone protocol, not vice versa. In contrast, the half-hearted performance of most governments with respect to climate policy measures has not matched their political rhetoric about the urgency of targets.

Another important lesson from the Montreal history was that not all countries need to agree in order to take a substantial step forward. It is also relevant to note that, in contrast to Kyoto, developing nations did accept limitations on their CFC consumption, but only when they were assured of equitable access to new technologies. Technology development is the missing guest at the Kyoto feast. . .

Friday Funny - Neural Circuits

Chris Mooney explains the biological mechanisms that have led experts to be able to protect their minds against the corrosive effects of ideology and politics:
I’m not saying anyone is capable of being 100 percent unbiased but I am saying that scientists evaluate scientific claims, and also claims about expertise, using the norms of their profession, precisely because they have neural circuits for doing so laid down by many years of experience. Which the other groups don’t have.
So when it is revealed that many scientists have partisan and ideological leanings, this is not a function of their biases, but rather a reflection of truth. This is quite different from arguing that "Liberals have a reality bias."  You can follow the logic from there.

21 April 2011

Analysis of the Nisbet Report -- Part II, Political Views of Scientists

One part of Matthew Nisbet's recent report that has received very little attention is its comparative analysis of ideological and partisan perspectives of members of the American Association for the Advancement of Science.

Nisbet shows that AAAS members are extremely partisan and ideological.  The word "extremely" is mine, and what do I mean by it?  Look at the figure above:  AAAS members are more partisan than MSNBC viewers and even Tea Party members.  AAAS members are more ideological than evangelical churchgoers but less so than Fox News viewers.  In both cases AAAS members are very different than the public as a whole.

Dan Sarewitz has discussed the problems with the ideological and partisan like-mindedness of our scientific community, which has been exploited and reinforced in political debates:
During the Bush administration, Democrats discovered that they could score political points by accusing Bush of being anti-science. In the process, they seem to have convinced themselves that they are the keepers of the Enlightenment spirit, and that those who disagree with them on issues like climate change are fundamentally irrational. Meanwhile, many Republicans have come to believe that mainstream science is corrupted by ideology and amounts to no more than politics by another name. Attracted to fringe scientists like the small and vocal group of climate skeptics, Republicans appear to be alienated from a mainstream scientific community that by and large doesn't share their political beliefs. The climate debacle is only the most conspicuous example of these debilitating tendencies, which play out in issues as diverse as nuclear waste disposal, protection of endangered species, and regulation of pharmaceuticals.

How would a more politically diverse scientific community improve this situation? First, it could foster greater confidence among Republican politicians about the legitimacy of mainstream science. Second, it would cultivate more informed, creative, and challenging debates about the policy implications of scientific knowledge. This could help keep difficult problems like climate change from getting prematurely straitjacketed by ideology. A more politically diverse scientific community would, overall, support a healthier relationship between science and politics.
It should come as no surprise that the increasing politicization of science has made science more political rather than politics more scientific.  At the same time, the more partisan and/or ideological you are, the more welcome and comfortable you will find the politicization of science, as it reinforces your preconceptions.

It also fits perfectly into a political strategy that holds that arguments about science can help to resolve political debates.  Climate change is only the most visible example of this tendency, where the empirical evidence shows that efforts to wage climate politics through climate science have had the greatest effect in magnifying the partisan divide.  Some are blinded by these dynamics -- for instance, Chris Mooney excuses the extreme partisanship/ideology of AAAS members by blaming . . . George W. Bush.

Anyone concerned with political decision making in a society that contains a diversity of partisan and ideological perspectives should be concerned that, in one sector at least, the experts that we rely on have views far different from those of the broader society.  One response would be to wage a political battle to convert the broader society to the values of the experts, perhaps on the theory that through improved science communication or education a great value transformation will occur.

My sense is that this strategy is not just doomed to fail, but will have some serious blowback effects on the scientific community itself.  More likely, in my view, is that such efforts to transform society through science will instead lead to the partisan debates across society taking firmer root within our expert communities. This is a topic that deserves more discussion and debate.  Dan Sarewitz concludes provocatively that, "A democratic society needs Republican scientists."

It is important to recognize that hyper-partisans like Joe Romm and Chris Mooney will continue to seek to poison the wells of discussion within the scientific community (which is left-leaning, so this is a discussion that needs to occur, at least to start, within the left) through constant appeals to partisanship and ideology.  Improving the role of science and scientists in our political debates will require an ability to rise above such efforts to associate the scientific community with only a subset of partisan and ideological perspectives.  But science and expertise belong to all of us, and should make society better as a whole.

Politicians Who Fail to Understand Policymaking

I have a new column up at Bridges, and it is a bit more hard hitting than my usual quarterly perspective.  In it I explain that we should be a bit forgiving when politicians don't have the same level of knowledge as experts, as they can't be experts in everything.  However, we should be far less forgiving when politicians show that they don't understand the mechanisms of policy.  Here is an excerpt:
[I]t should be far less worrisome that the public or policy makers do not understand this or that information that experts may know well. What should be of more concern is that policy makers appear to lack an understanding of how they can tap into expertise to inform decision making. This situation is akin to flying blind.

Specialized expertise typically does not compel particular decisions, but it does help to make decisions more informed. This distinction lies behind Winston Churchill's oft-cited advice that science should be "on tap, but not on top." Effective governance does not depend upon philosopher kings in governments or in the populace, but rather on the use of effective mechanisms for bringing expertise into the political process.

It is the responsibility - even the special expertise - of policy makers to know how to use the instruments of government to bring experts into the process of governance.
Read the whole thing here and feel free to come back and discuss, debate or challenge.

19 April 2011

Analysis of The Nisbet Report -- Part I, Doomed to Fail

Matt Nisbet of American University has issued a report that includes some fascinating information and compelling analysis of a number of issues related to the US environmental community, the media and scientists as related to climate change.  I was a reviewer of the report, which means that I read an early version and provided some critical comments to Matt which he considered (or not) in the revision.

With a few posts, starting with this one, I want to provide a brief discussion of the top line empirical findings in the Nisbet report and why I think that they are important.

Top line finding number 1
The environmental community spends a truckload of money on a strategy doomed to fail

While horse race aficionados will continue to focus attention on the minutia of financial accounting in order to argue about who spent more on the cap and trade battle, the good guys or the bad guys, far more interesting is data gathered which documents the sheer magnitude of expenditure by the environmental community in support of a specific approach to climate policy.

The data show quite clearly, no matter how one may try to parse it, that the debate over climate change is not David versus Goliath, but rather two Goliaths slugging it out in high stakes, big money power politics.  According to data gathered by Nisbet, the leading environmental advocacy foundations spent upwards of half a billion dollars from 2008-2010, largely in support of a collective strategy expressed in the little-known but crucially important Design to Win strategy document for environmental philanthropy (here in PDF):
Approximately $368 million was distributed across the 1,246 individual grants. However, given that not all foundation records are publicly available for this period, the total of $368 million likely underestimates the actual amount distributed between 2008 and 2010. If an average based on a foundation’s previous year giving is used as a stand-in for missing years, these nine foundations would have distributed more than $560 million between 2008 and 2010.
The Design to Win strategy is encapsulated in the following figure which helps to explain what the mainstream environmental community thinks it has been doing on the climate issue over recent years.  It also explains what you see many climate bloggers doing which often includes wringing their hands over what they believe to be a lack of education among the public, the media and the politicians.
The figure shows that the theory at work here is that "education" of the public and the media is expected to "push decision makers," which "creates context for new policy," and, along with "education" of decision makers, "enables new policy" to drive "massive change" in investments.  This framework is one important reason why we have the climate wars, as it shows that the environmental community believes that the key to policy success lies in winning a war for public opinion.

Yes, of course there are conservative and other groups opposed to action on climate change who also have truckloads of resources at their disposal and who also think they are fighting a battle over public opinion.  But we know this, as it has been well documented and discussed for years, with many such studies cited by Nisbet in his bibliography. What Nisbet's report does is help to fill a notable gap in the study of climate policy: documenting the efforts of those in support of action. What his analysis clearly shows is that the environmental community has enormous resources at its disposal, which are focused on a strategy of public education.

While the underlying strategy adopted by the US environmental community may be titled "Design to Win," it is in fact "Doomed to Fail." As I document extensively in The Climate Fix, the battle for public opinion on climate change has long been over and the environmental community has won, as a majority of people consistently believe that humans affect the environment and are in support of action.  Further, the idea that public education leads to new policy is deeply flawed political science and has been routinely debunked in the science studies literature as the so-called "deficit model." More fundamentally, it is simply contrary to history, experience and plain old common sense.

The following data from Gallup shows little change in aggregate public opinion over many years. Such data is fairly representative of public opinion polling over the past decade and longer on public views of climate science and action (see The Climate Fix for more detail).
And while public understanding, concern about and support for action have been largely stable (albeit with various ups and downs) for decades, the most notable shift has been the dramatic partisan divide that has opened up on the issue.
“Of those who identify as Republicans, about 49 percent said in the 2001 Gallup survey that they believe the effects of global warming have already begun — a number that dropped to 29 percent in 2010,” states a summary of Associate Professor Aaron McCright’s study, which appears in the Spring issue of Sociological Quarterly.

“Meanwhile, the percentage of Democrats who believe global warming has already begun increased from about 60 in 2001 to 70 in 2010. All told, the gap between these ‘believers’ in the two parties increased from 11 percent in 2001 to 41 percent in 2010,”
Of course, one could look at this data and conclude that environmental groups simply haven't done enough education of the public or that the forces of darkness are still in the lead, as measured by spending, so more spending on "education" is needed.  It is certainly a convenient argument to advance if you are in the business of trying to "educate" the public, especially if you are a recipient of foundation funding under the Design to Win strategy.  It is also insane to expect to continue the same behavior and to see different results.

But what if the Design to Win strategy is in fact fanning the partisan divide and as a consequence making action less likely?  What if the "education" strategy has morphed into destructive efforts to silence or discredit alternative voices in the climate debate other than those which espouse the narrow set of policy prescriptions endorsed by Design to Win?  What if the entire theory behind the idea of rectifying a deficit of public understanding is based on flawed premises?

Nisbet suggests that there are consequences:
[F]ocus and strategy are only as effective as the premises upon which they are based. As described, the Design to Win report appeared to define climate change in conventional terms, as an environmental problem that required only the mobilization of market incentives and public will. With this definition, comparatively limited funding was directed toward fostering the role of government in promoting new technology and innovation. Nor was there equivalent investment in such important human dimensions of the issue as adaptation, health, equity, justice or economic development.

The Two Italys

The FT's fascinating discussion of economic differences between northern and southern Italy suggests that the different regional approach to innovation helps to explain the almost 100% difference in per capita GDP (chart above).  The north is linked to Germany and sees innovation as central, whereas the south is more state oriented:
Marco Fortis, an Italian economist, says the fate of tens of thousands of entrepreneurs scattered across north and central Italy has for decades been linked to the development of German industry. The small and medium-sized companies that are the backbone of the Italian economy specialise in the manufacture of premium finished products that are bought by German multinationals for assembling high-end goods, be it cars or washing machines.

The fruits of this relationship are reflected in the similar levels of wealth creation in Germany and north and central Italy.

Despite Italy’s sluggish growth of about 1 per cent a year for the past decade, per capita gross domestic product in north and central Italy is more or less equal to that of Germany as a whole. But in Italy’s south GDP per capita is below Portugal, and among the lowest in Europe.

While entrepreneurs in the north and the centre of Italy imitate Germany, industry in the south is far distant, both geographically and culturally. Although there are examples of entrepreneurism in the south, such as wine growers in Sicily, most industry is traditionally controlled and financed by the state.

Carbon Regulation Default Swaps

The FT provides a small window into financial innovation related to carbon trading with an article describing a new financial product that is intended to allow carbon traders to hedge or even speculate against regulatory changes in the carbon market:
Kiln, a unit of Japan’s Tokio Marine that is one of the leading Lloyd’s of London underwriters, and specialist underwriter Parhelion, have jointly created a policy for an unnamed bank to insure its options on future Certified Emissions Reductions. The credits are issued under the Kyoto protocol to projects that cut greenhouse gases.

Underwriters hope the new policy will act as a safety net and encourage traders to remain active and provide liquidity.

The policy was designed for the bank in response to the move by the European Union’s Climate Change Committee to ban trading in credits earned from plants that destroyed two sources of greenhouse gases – HFC-23, a byproduct of refrigerant manufacturing, and adipic acid.

Julian Richardson, chief executive of Parhelion, said that while policy development under the Kyoto Clean Development Mechanism had settled down, EU policy on the Emissions Trading Scheme was a moveable feast and that this policy uncertainty was discouraging investors.

“Because this market exists purely through regulation, banks are faced with a lot of regulatory risk,” he said. “The EU decided only late last year that these two types of project no longer qualified.”
Hmmm ... a new financial product that allows speculators to win and lose according to future governmental decisions in a market that exists only because of regulation.  Does anyone else see some problems here?  A "moveable feast" indeed.

The IPCC's Proposed COI Policy

The Intergovernmental Panel on Climate Change has put forward a surprisingly rigorous proposal for handling actual and perceived conflicts of interest (PDF, background paper here in PDF).  The proposal has a few notable weak spots in its plans for adjudication and disclosure, but represents a serious effort to fix this gap in IPCC procedures. If implemented as written, the proposal will likely lead to a considerable degree of turnover in the existing IPCC leadership, authors and staff, due both to the rigor of the policy and to the disclosure it mandates.

The proposal defines a conflict of interest as follows:
A “conflict of interest” refers to any current financial or other interest which could: i) significantly impair, or could appear to impair, the individual’s objectivity in carrying out his or her duties and responsibilities for the IPCC, or ii) create an unfair advantage, or appear to create an unfair advantage, for any person or organization. For the purposes of this policy, the appearance of a conflict of interest - an “apparent conflict of interest” - is one in which circumstances could lead a reasonable person to question an individual’s objectivity or question whether an unfair advantage has been created.

The conflict of interest requirements in this policy are not designed to include an assessment of one's actual behaviour or character, one's ability to act objectively despite the conflicting interest, or one's relative insensitivity to particular amounts of specific assets because of one's personal wealth. The requirements are designed to eliminate situations involving real or apparent conflicts of interest, and thereby to protect the individual, the organization, and the public interest. Those contributing to IPCC products should not be placed in a situation where others could reasonably question, and perhaps discount or dismiss, the work of the IPCC simply because of the existence, or the apparent existence, of conflicting interests.
The proposal focuses on the following individuals:
the IPCC Chair, Vice Chairs, Working Group and Task Force Co-chairs and other members of the IPCC Bureau, authors with responsibilities for report content (Coordinating Lead Authors, Lead Authors and Review Editors) and the staff of the Technical Support Units (TSUs)
The proposed policy requires a considerable degree of disclosure:
In ascertaining the possible presence of a conflict of interest, the following kinds of financial interests will be disclosed and reviewed: employment relationships (including private and public sector employment and self-employment); consulting relationships (including working in commercial or professional consulting or service arrangements, serving on scientific and technical advisory boards, serving as an expert witness in litigation, and providing services in exchange for honorariums and expense reimbursement); directorships; stocks, bonds, and other financial instruments and investments including partnerships; real estate investments; patents, copyrights, and other intellectual property interests; commercial business ownership and investment interests; research funding and other forms of research support.

All significant and relevant non-financial interests should be disclosed. These include any associations with organisations with an interest in the topic of the IPCC report or product to which the individual is contributing. These may include government advisory committees, nondepartmental public bodies, charities or non-governmental organisations. Such associations are not necessarily incompatible with participation in IPCC, but it is important that they are disclosed. All interests that might undermine the credibility of the IPCC report or product if they were made public during or after its preparation should be disclosed.
Here the proposal gets a little opaque.  The disclosed information will be considered by a newly appointed committee, which will render judgments on the ability of the individual to serve on the IPCC.  This committee will apparently work in secret; its deliberations will not be made public, nor will the disclosures.  This is problematic.  The IPCC will not solve its transparency problems by creating a star chamber deeper in its bureaucracy.

In 2009 the Bipartisan Policy Center issued a report on scientific committee empanelment (PDF here; I served on the panel) that explicitly addressed this issue.  We sought to maintain a balance between the need for public transparency and the privacy of individuals.  Here is what we said in that report:
Not only the government, but also the public needs more information to determine whether a conflict or bias exists and has been appropriately handled. To build public trust through transparency, much more information on federal advisory committee members needs to be available than is now the case.

Obviously, a balance must be struck between the value of public information and privacy concerns. And public disclosure must not be so extensive that it greatly reduces the number of scientists willing to serve on committees. Federal agencies should monitor whether new requirements are making it harder to attract committee members. But disclosure is becoming more routine – in scientific journals and at universities, for example – and the government should not be a last bastion of secrecy.

One possibility would be for federal agencies to make publicly available all the information on a panelist’s disclosure form except the precise dollar amounts of their stock holdings or compensation and any information on the finances of their spouse or dependent children. At the same time, the agency would disclose the member’s educational background and scientific credentials. Ideally, all of this information would be released when committee members’ names were put up on the Web for public comment.
A second weak spot in the proposal is that it relies entirely on self-disclosure.  The proposed policy has no contingency for false, incomplete or misleading reporting.  This creates risks for both the organization and its participants.  Because the disclosures are not released to the public in any form, there is every incentive for participants to under-report potential conflicts.

The IPCC can deal with this in part by publishing a public version of the disclosure forms (e.g., without specific financial information but with institutional relationships intact).  At a minimum, participants should be required to attest that their self-disclosure is accurate, with the understanding that a disclosure later found to be inaccurate would disqualify the individual from participating in the IPCC.  Absent some sanction for filing a misleading or incorrect disclosure form, the procedure risks becoming a formality with little meaning.  Having served on NRC committees in the past, I have seen unsanctioned self-disclosure in action.
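To make the suggestion concrete, here is a minimal sketch of what a published, redacted disclosure record might look like.  The field names are my own invention, loosely following the categories in the proposal and the BPC recommendation quoted above; they are not drawn from any actual IPCC form:

public_disclosure = {
    "name": "A. Author",
    "ipcc_role": "Lead Author, Working Group II",
    "employment": ["University X"],
    "consulting_and_advisory": ["Scientific advisory board, Reinsurer Y"],
    "board_and_ngo_memberships": ["NGO Z"],
    "research_funding_sources": ["National science agency"],
    "attestation": "I attest that this disclosure is complete and accurate.",
    # Withheld from the public version: dollar amounts, specific holdings,
    # and any finances of a spouse or dependent children
}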

Finally, the IPCC is silent on what to do with current participants who are in violation of the conflict of interest guidelines in this proposal.  As might be expected of an organization that has operated for several decades with no COI policies, such conflicts are rife, extending to its chairman, Rajendra Pachauri, who has a wheelbarrow full of actual and potential conflicts.  How will the proposal be phased in?

The IPCC should be commended for taking the recommendation of the InterAcademy Council seriously.  We should be encouraged by the IPCC's initial steps on this issue, and the organization should keep moving in the right direction.  At the same time, its work on this topic is not done; it has only just begun.

18 April 2011

A Decrease in Floods Around the World?

A new analysis of floods around the world has been called to my attention.  It runs contrary to conventional wisdom but is consistent with the scientific literature on global trends in peak streamflows.  Is it possible that floods are not increasing, or are even in decline, even as most people have come to believe the opposite?

Bouziotas et al. presented a paper at the EGU a few weeks ago (PDF) and concluded:
Analysis of trends and of aggregated time series on climatic (30-year) scale does not indicate consistent trends worldwide. Despite common perception, in general, the detected trends are more negative (less intense floods in most recent years) than positive. Similarly, Svensson et al. (2005) and Di Baldassarre et al. (2010) did not find systematical change neither in flood increasing or decreasing numbers nor change in flood magnitudes in their analysis.
This finding is largely consistent with Kundzewicz et al. (2005) who find:
Out of more than a thousand long time series made available by the Global Runoff Data Centre (GRDC) in Koblenz, Germany, a worldwide data set consisting of 195 long series of daily mean flow records was selected, based on such criteria as length of series, currency, lack of gaps and missing values, adequate geographical distribution, and priority to smaller catchments. The analysis of annual maximum flows does not support the hypothesis of ubiquitous growth of high flows. Although 27 cases of strong, statistically significant increase were identified by the Mann-Kendall test, there are 31 decreases as well, and most (137) time series do not show any significant changes (at the 10% level). Caution is advised in interpreting these results as flooding is a complex phenomenon, caused by a number of factors that can be associated with local, regional, and hemispheric climatic processes. Moreover, river flow has strong natural variability and exhibits long-term persistence which can confound the results of trend and significance tests.
They conclude (emphasis added):
Destructive floods observed in the last decade all over the world have led to record high material damage. The conventional belief is that the increasing cost of floods is associated with increasing human development on flood plains (Pielke & Downton, 2000). However, the question remains as to whether or not the frequency and/or magnitude of flooding is also increasing and, if so, whether it is in response to climate variability and change.

Several scenarios of future climate indicate a likelihood of increased intense precipitation and flood hazard. However, observations to date provide no conclusive and general proof as to how climate change affects flood behaviour.
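For readers who want to poke at streamflow records themselves, the Mann-Kendall test mentioned in the Kundzewicz et al. excerpt is straightforward to implement.  The Python sketch below is a minimal version of the standard test, with no correction for ties or for the long-term persistence the authors warn can confound trend tests; the flow values are made up purely for illustration:

import math

def mann_kendall(x):
    # Mann-Kendall S statistic: count of upward minus downward pairs
    n = len(x)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = x[j] - x[i]
            s += (diff > 0) - (diff < 0)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0      # variance of S assuming no ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)            # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = math.erfc(abs(z) / math.sqrt(2))          # two-sided p-value, normal approximation
    return s, p

# A made-up 30-year record of annual maximum daily flows (m3/s), for illustration only
flows = [820, 765, 910, 1005, 690, 740, 880, 950, 800, 720,
         860, 930, 1010, 780, 700, 845, 905, 990, 760, 830,
         890, 955, 815, 745, 875, 940, 1020, 795, 735, 865]
s, p = mann_kendall(flows)
print(s, p)   # a trend is "significant at the 10% level" if p < 0.10

Run against a real GRDC series, a positive S with a small p-value would indicate a significant increase in annual maximum flows, and a negative S a decrease; most of the 195 series in the Kundzewicz et al. analysis showed neither.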
References:

Bouziotas, D., G. Deskos, N. Mastrantonas, D. Tsaknias, G. Vangelidis, S.M. Papalexiou, and D. Koutsoyiannis, Long-term properties of annual maximum daily river discharge worldwide, European Geosciences Union General Assembly 2011, Geophysical Research Abstracts, Vol. 13, Vienna, EGU2011-1439, European Geosciences Union, 2011.

Kundzewicz, Z.W., D. Graczyk, T. Maurer, I. Przymusińska, M. Radziejewski, C. Svensson and M. Szwed, 2005(a): Trend detection in river flow time-series: 1. Annual maximum flow. Hydrol. Sci. J., 50(5): 797-810.

15 April 2011

The FA Work Permit: A Positive or Negative?

The Professional Football Players Observatory has issued their first Global Player Migration Report, and it has some really interesting data on the flow of footballers between associations around the world.  For instance, here is a figure showing the importation of players into the EPL in 2010:
The text that accompanies the figure explains:
In 2010, Premier League clubs have imported players from 27 national associations. The geography of these imports shows the importance of spatial proximity. The majority of international signings have been carried out from national associations of EU's member states, for the most part neighbouring ones.

This situation is partially due to the ruling obliging non-EU footballers to have played 3/4 of the matches of their national A-team during the two years preceding the transfer to be eligible for a work permit. Insofar as footballers playing for local clubs have almost no chance of being selected in many of the most competitive non-European national teams, direct imports from there are often impossible.
The details on FA work permits for non-EU players can be found here; the permit presents a considerable obstacle to gaining entry to the EPL.  For instance, Colorado Rapids player Omar Cummings was recently denied a work permit because his national team, Jamaica, is not among the world's top 70 ranked nations (another of the work permit regulations):
The 28-year-old Jamaican traveled to Birmingham, England, in December to begin training with the English Premier League team. The Rapids' official site says Cummings' work permit was denied because he is not a member of the Premier League and his country has not been in the top 70, on average, in the FIFA World Rankings over the past two years.
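Taken together, the criteria reported above amount to a simple eligibility check for non-EU players.  A rough sketch follows; the function, the reduction of the rules to two conditions, and the numbers plugged in for Cummings are my own simplifications, not the FA's actual procedure (which involves more than this):

def work_permit_eligible(caps, national_team_matches, avg_fifa_rank_two_years):
    # Criterion 1: played in at least 3/4 of the national A-team's matches
    # over the two years preceding the transfer
    played_enough = caps >= 0.75 * national_team_matches
    # Criterion 2: country averaged inside the top 70 of the FIFA rankings
    # over the same two years
    country_ranked = avg_fifa_rank_two_years <= 70
    return played_enough and country_ranked

# Illustrative only -- the caps figures are invented; Jamaica's two-year
# average ranking outside the top 70 is what sank Cummings' application
print(work_permit_eligible(caps=18, national_team_matches=20, avg_fifa_rank_two_years=85))  # False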
The work permit raises some interesting questions that might be addressed empirically, such as:

1. Does the work permit hurt the quality of EPL teams?

On the one hand, the performance of the EPL in the UEFA Champions League would tend to support the argument that at the top, the answer is "no."

On the other hand, the dominance of a handful of teams at the top of the league year after year suggests that the league has some problems with parity.

2. Does the work permit make the EPL more globally attractive?

I think that the answer here must be "yes."  By allowing in only those non-EU players who excel in international competitions for their national teams, the EPL guarantees that it is bringing in widely recognized national stars who will command an audience from outside Europe.

3. Does the work permit aid the development of English players?

I am not sure how this might be measured, but I can come up with arguments on both sides.  My instincts say "no," and I wouldn't be surprised if the work permit actually makes room in the EPL for weaker English players than would make it in a fully open market.

Ironically enough the FA Work Permit is a government-mandated performance standard that bears some relationship to ongoing discussions on this blog about performance standards in technology, such as for light bulbs and fish hooks.  An evaluation of the effectiveness of the work permit policy will be a function of the goals that it seeks to achieve.  If the goals are to improve the quality of soccer in the EPL, enhance parity or develop the English national team, I'd hypothesize that the policy is not succeeding.  However, if the goal is to help give the EPL a competitive advantage in the international business of football (e.g., via brands, marketing, merchandising, TV rights, etc.) then it may indeed be a success.

I'd be interested in pointers to any academic or industry analyses of this issue.  Thanks!

14 April 2011

What Prompted the Decline of Oil Power?

The figure above comes from the IMF World Economic Outlook released earlier this week in a chapter on "oil scarcity" (PDF).  The report explains the figure as follows:
Most OECD countries saw a big switch away from oil in electric power generation in the early 1980s. After oil prices rose sharply compared with the prices of other fossil fuels in the 1970s, the power sector switched from oil to other input (Figure 3.6): some countries went back to coal (for example, the United States); others increased their nuclear capacity (for example, France) or turned to alternative energy sources.
Over about 40 years oil lost about 90% of its role as a source of energy for electricity production (from a 25% share to a 2.5% share).  There are a few interesting points to take from this dramatic shift, some of which seem obvious but nonetheless worth highlighting.

1. Significant energy shifts happen.
2. They can take many decades.
3. Such shifts depend upon available substitutes.
4. The trend was from more expensive energy to less expensive energy, not vice versa.
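As a quick check on the arithmetic above (a back-of-the-envelope sketch; the 40-year span is approximate, taken from the figure):

share_start, share_end = 0.25, 0.025              # oil's share of electricity generation, then and now
relative_decline = 1 - share_end / share_start    # about 0.90, i.e. a 90% relative decline
implied_annual_rate = (share_end / share_start) ** (1 / 40) - 1   # roughly -5.6% per year over ~40 years
print(relative_decline, implied_annual_rate)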

There is a lot of material in the IMF report that will be worth a future discussion as well.

13 April 2011

Weak Hooks and Dim Bulbs

Rand Paul's invocation of Ayn Rand yesterday to complain about government technology standards for light bulb performance gives me an opportunity to raise a few questions.

If the government can mandate technology performance standards for fishing hooks, then why not light bulbs, or any other technology for that matter?  I can understand that people may wish to debate the substance of technology standards or whether they are necessary in particular cases.  What I don't get are arguments like Paul's that suggest that such standard setting is somehow illegitimate. There is really no legal or policy basis for such views, and while espousing such views may appeal to populist sentiment, they actually undercut the ability of the government to govern, which I suppose may be the point.

And this makes me wonder, why aren't those who complain about light bulbs on the warpath against "weak hooks"?

Energy Supply Implications of Japan's Nuclear Crisis

At a meeting last week I was treated to an overview of Japan's energy situation in the aftermath of the Tohoku earthquake and tsunami by Yuhji Matsuo of the Institute of Energy Economics, Japan.  He presented the slide above, which is reproduced here with his permission.  The table on the left shows the electricity supply knocked out by the quake.  The figure on the right shows available peak supply (red dashed line), indicating when capacity is expected to return, alongside needed peak supply (blue bars, based on 2010 consumption).  The slide was accompanied by this text:
‐ After the earthquake, many power plants (nuclear, LNG, coal, etc.) stopped operation and 27.1 GW generating capacity was lost.
‐ As of Mar 17th, TEPCO's electricity supply capacity was 33.5 GW, while estimated peak demand in March was 52 GW. TEPCO started scheduled blackouts in the eastern part of Japan for the first time since World War II.
‐ Blackouts will end in April, and start again in summer, when air cooling demand will boost electricity consumption.
‐ Great efforts have been made in these regions (including Tokyo) to reduce electricity consumption.
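From the numbers on the slide, the scale of the gap is easy to work out (my own back-of-the-envelope arithmetic, not Matsuo's):

capacity_gw = 33.5        # TEPCO supply capacity as of March 17, from the slide
peak_demand_gw = 52.0     # estimated March peak demand, from the slide
shortfall_gw = peak_demand_gw - capacity_gw
print(shortfall_gw, shortfall_gw / peak_demand_gw)   # about 18.5 GW, or roughly 36% of peak demand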
I found the following to be interesting as well:

1. A total of 4 nuclear complexes felt the effects of the earthquake and the tsunami, including flooding.  Of those, 3 survived with no problems.  Obviously, a 75% success rate is not good enough, but it does indicate that reactor complexes can be designed to successfully survive such an event.

2. Far more fossil energy was knocked out by the event than nuclear energy. Another expert from Japan mentioned (somewhat wryly, I'd say) that this might mean that Japan hits its Kyoto emissions reduction targets.  Over the long term, to the extent that the restored supply is more carbon intensive than what it replaces (which seems inevitable), Japan's carbon dioxide emissions will necessarily rise above pre-earthquake levels.

The IEEJ has much more interesting information and analyses available here.  Thanks Yuhji!

12 April 2011

Richard Muller on NPR: Don't Play With the Peer Review System

I haven't really been paying much attention to the various issues associated with the Berkeley Earth project run by Richard Muller.  I expect that the peer-reviewed literature will sort it out eventually.  Whatever the ultimate scientific merit of that project, these comments of his on NPR yesterday are worth thinking about, and spot on in my view:
CONAN: Do you find that, though, there is a lot of ideology in this business?

Prof. MULLER: Well, I think what's happened is that many scientists have gotten so concerned about global warming, correctly concerned I mean they look at it and they draw a conclusion, and then they're worried that the public has not been concerned, and so they become advocates. And at that point, it's unfortunate, I feel that they're not trusting the public. They're not presenting the science to the public. They're presenting only that aspect to the science that will convince the public. That's not the way science works. And because they don't trust the public, in the end the public doesn't trust them. And the saddest thing from this, I think, is a loss of credibility of scientists because so many of them have become advocates.

CONAN: And that's, you would say, would be at the heart of the so-called Climategate story, where emails from some scientists seemed to be working to prevent the work of other scientists from appearing in peer-reviewed journals.

Prof. MULLER: That really shook me up when I learned about that. I think that Climategate is a very unfortunate thing that happened, that the scientists who were involved in that, from what I've read, didn't trust the public, didn't even trust the scientific public. They were not showing the discordant data. That's something that - as a scientist I was trained you always have to show the negative data, the data that disagrees with you, and then make the case that your case is stronger. And they were hiding the data, and a whole discussion of suppressing publications, I thought, was really unfortunate. It was not at a high point for science.

And I really get even more upset when some other people say, oh, science is just a human activity. This is the way it happens. You have to recognize, these are people. No, no, no, no. These are not scientific standards. You don't hide the data. You don't play with the peer review system.

11 April 2011

Three Talks in Boston This Week

On Wednesday, April 13th I'll be speaking at Brandeis University:
Disasters and Climate Change: The Science and The Politics
5pm-6pm, Heller Schneider G3
Brandeis University
On Thursday, April 14th at 11am I'll be speaking at MIT:
Scientists in Policy and Politics
MIT Program in Atmospheres, Oceans, and Climate
11am-Noon 54-915

Scientists, and experts more generally have choices about the roles that they play in today's political debates on topics such as global warming, genetically modified foods, and food and drug safety, just to name a few. This talk is about understanding these choices, their theoretical and empirical bases, what considerations are important to think about when deciding, and the consequences for the individual scientist and the broader scientific enterprise.
Then later that day at MIT, if I am still coherent, I'll be speaking at 5PM:
The Climate Fix
MIT Energy Club, MIT Program in Atmospheres, Oceans, and Climate, and MIT Science Policy Initiative
5-6pm, E51-325

The world’s response to climate change is deeply flawed. The conventional wisdom on how to deal with climate change has failed and it’s time to change course. To date, climate policies have been guided by targets and timetables for emissions reduction derived from various academic exercises. Such methods are both oblivious to and in violation of on-the-ground political and technological realities that serve as practical “boundary conditions” for effective policy making. Until climate policies are designed with respect for these boundary conditions, failure is certain. Using nothing more than arithmetic and logical explanation, this talk provides a comprehensive exploration of the problem and a proposal for a more effective way forward.

Blind Spots in Australian Flood Policies

John McAneney of Risk Frontiers at Macquarie University in Sydney identifies some opportunities for better flood policies in Australia.  First, he explains that better management of flood risks in Australia will depend upon better data on flood risk.  However, collecting such data has proven problematic:
As many Queenslanders affected by January’s floods are realising, riverine flood damage is commonly excluded from household insurance policies.

And this is unlikely to change until councils – especially in Queensland – stop dragging their feet and actively assist in developing comprehensive data insurance companies can use.

Why? Because there is often little available information that would allow an insurer to adequately price this flood risk.

Without this, there is little economic incentive for insurers to accept this risk. It would be irresponsible for insurers to cover riverine flood without quantifying and pricing the risk accordingly.

The first step in establishing risk-adjusted premiums is to know the likelihood of the depth of flooding at each address. This information has to be address-specific because the severity of flooding can vary widely over small distances, for example, from one side of a road to the other.

Risk Frontiers is involved in jointly developing the National Flood Information Database (NFID) for the Insurance Council of Australia with Willis Re, a reinsurance broking intermediary. NFID is a five year project aiming to integrate flood information from all city councils in a consistent insurance-relevant form.

The aim of NFID is to help insurers understand and quantify their risk. Unfortunately, obtaining the base data for NFID from some local councils is difficult and sometimes impossible despite the support of all state governments for the development of NFID.

Councils have an obligation to assess their flood risk and to establish rules for safe land development. However, many are antipathetic to the idea of insurance.

Some states and councils have been very supportive – in New South Wales and Victoria, particularly. Some states have a central repository – a library of all flood studies and digital terrain models (digital elevation data).
Council reluctance to release data is most prevalent in Queensland, where, unfortunately, no central repository exists.

A litany of reasons is given for withholding data. At times it seems that refusal stems from a view that insurance is innately evil. This is ironic in view of the gratuitous advice sometimes offered by politicians and commentators in the aftermath of extreme events, exhorting insurers to pay claims even when no legal liability exists and riverine flood is explicitly excluded from policies.
Second, models of flood risk are sometimes misused:
Another issue is that many councils only undertake flood modelling in order to create a single design flood level, usually the so-called one-in-100 year flood. (For reasons given later, a better term is the flood with an 1% annual likelihood of being exceeded.)

Inundation maps showing the extent of the flood with a 1% annual likelihood of exceedance are increasingly common on council websites, even in Queensland. Unfortunately these maps say little about the depth of water at an address or, importantly, how depth varies for less probable floods. Insurance claims usually begin when the ground is flooded and increase rapidly as water rises above the floor level.

At Windsor in NSW, for example, the difference in the water depth between the flood with a 1% annual chance of exceedance and the maximum possible flood is nine metres.

In other catchments this difference may be as small as ten centimetres. The risk of damage is quite different in both cases and an insurer needs this information if they are to provide coverage in these areas.

The ‘one-in-100 year flood’ term is misleading. To many it is something that happens regularly once every 100 years — with the reliability of a bus timetable. It is still possible, though unlikely, that a flood of similar magnitude or even greater flood could happen twice in one year or three times in successive years.

The calculations underpinning this are not straightforward but the probability that an address exposed to a 1-in-100 year flood will experience such an event or greater over the lifetime of the house – 50 years say – is around 40%. Over the lifetime of a typical home mortgage – 25 years – the probability of occurrence is 22%. These are not good odds.
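The arithmetic behind those figures is worth making explicit: the chance of at least one exceedance in n years is 1 - (1 - p)^n, where p is the annual exceedance probability.  A quick check in Python:

p_annual = 0.01                 # the "1-in-100 year" flood: 1% chance of exceedance in any year
for years in (25, 50):
    p_at_least_once = 1 - (1 - p_annual) ** years
    print(years, round(p_at_least_once, 3))   # ~0.222 over 25 years, ~0.395 over 50 years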
More on Risk Frontiers at Macquarie University here.