31 July 2011

Reader Mail on In-State Tuition Reform

My column on in-state tuition reform was reprinted today in the Dallas Morning News and prompted this response, which came in by email from a former CU parent:
I was fortunate to see your article about in-state tuitions in my Sunday Dallas Morning News and wanted to tell you how pleased I was to see a professor actually trying to do something about the college costing paradigm. My daughter was graduated from your Journalism school in December of this past year and while we paid out of state rates--I feel that she/we received a reasonable value for our money; increased academic rigor, and more discussions of economic realities would have probably made her current job search slightly more palatable and productive. Your underlying assertions about needing to drive the University toward greater greatness for the good of all truly resonate, because at some level a rational parent looks at their child and says, "okay, now over the next 4 years I am going to plunk down $160k to get a diploma," is that worth as much as potentially staking the child to ownership of a small business? The answer really is: largely only if the diploma has some resultant training of the mind, some affiliation benefit, and some reputational worth. The balance can be found on the internet or for less cost elsewhere.

Too many college administrators appear to believe that the "wisdom" the professors bring to the classroom by their presence is either of such inestimable value or uniqueness that we should be honored to shower them with money. That isn't to say I begrudge any University what they charge, because for many students especially in the college age range, it is only by bringing the subjects to life that learning and inspiration occur and when they drive and inspire ground-breaking research, lifelong careers, and changed behaviour patterns the educators have effectively altered the course of history. Still the reality remains that students are going to be motivated by something, and 99% of what is taught is readily available through books and the internet so the schools really need to understand and focus on those aspects of the college investment that make one school more valuable than the next--and unless you want to compete on price, I would suggest that your essay is a good start for looking at what continues to make being a "Buff" worth the dollars people are now paying. And pushing this value equation could be the right catalyst for propelling your school to next level.


Dave Lapka

29 July 2011

Scientists: You Are No Longer Politically Useful

[UPDATED 8/1: I've clarified the sentence below about Monnett's statement about "sloppy" research, thanks to the readers in the comments.]

Charles Monnett is a wildlife biologist with the U.S. Bureau of Ocean Energy Management, Regulation and Enforcement in the US Department of the Interior. He has published in the peer-reviewed literature on polar bears. He also has just been suspended by the agency (PDF) under claims of possible scientific misconduct related to his polar bear research and was recently interviewed by criminal investigators.

The Guardian reports that the timing is suspicious due to forthcoming DOI action on oil drilling in the Arctic:
It was seen as one of the most distressing effects of climate change ever recorded: polar bears dying of exhaustion after being stranded between melting patches of Arctic sea ice.

But now the government scientist who first warned of the threat to polar bears in a warming Arctic has been suspended and his work put under official investigation for possible scientific misconduct.

Charles Monnett, a wildlife biologist, oversaw much of the scientific work for the government agency that has been examining drilling in the Arctic. He managed about $50m (£30.5m) in research projects.

Some question why Monnett, employed by the US Bureau of Ocean Energy Management, Regulation and Enforcement, has been suspended at this moment. The Obama administration has been accused of hounding the scientist so it can open up the fragile region to drilling by Shell and other big oil companies.

"You have to wonder: this is the guy in charge of all the science in the Arctic and he is being suspended just now as an arm of the interior department is getting ready to make its decision on offshore drilling in the Arctic seas," said Jeff Ruch, president of the group Public Employees for Environmental Responsibility. "This is a cautionary tale with a deeply chilling message for any federal scientist who dares to publish groundbreaking research on conditions in the Arctic."
The spectacle of a government scientist being queried by criminal investigators is pretty chilling, and evocative of the same sort of ham-handed behavior by the Bush Administration.  While Monnett argues that if the investigators' apparent allegations were correct he would only be guilty of doing sloppy research (PDF), it appears that neither he nor anyone else has been provided with information on what exactly he has been charged with related to scientific misconduct.

However this situation is resolved -- whether there is actual misconduct or it is a politically motivated harassment, both or neither -- the handling of this case is not a high point for the Obama Administration's efforts to "restore scientific integrity." More amazing is that as of this writing the entire liberal blogosphere (OK, Mother Jones just put up a post as I have been drafting this) that was so agitated about a "war on science" during the Bush Administration has thus far ignored this case. One can imagine what would have been the reaction had this occurred in 2006.

For scientists, the lesson here should be clear -- your status with partisans is a function of your perceived usefulness to their political agenda of the day.

28 July 2011

The Simple Math and Logic Underpinning Climate Pragmatism

Here is an open invitation to Joe Romm, David Roberts, Michael Tobis and any other self-proclaimed "climate hawks" to explain what is wrong with the math and logic presented below. This is the math and logic that underpins the arguments of "climate pragmatism" -- such as espoused in the report released yesterday by that name, The Hartwell Paper and The Climate Fix.

The "climate hawks" have usually been pretty loath to engage in open intellectual debate, preferring instead to lob ad homs and mischaracterizations.  (Maybe they should be called "climate chickens" -- that is a joke;-)  So here I make it easy for them.

Below, I have broken out an argument into 10 points to make it easy for critics to identify where they disagree and provide evidence to the contrary. So here is a chance -- an open invitation even -- for them to point out errors in the logic and math behind climate pragmatism.

1. Decarbonization refers to a decrease in the ratio of carbon dioxide emissions to GDP.

2. Stabilization of carbon dioxide concentrations in the atmosphere (at any low level, but let's say 350 ppm to 550 ppm for those who want a number) requires a rate of decarbonization of >5% per year.

3. The world has been decarbonizing for at least 100 years, and the rate of this decarbonization was about 1.5% from 1980 to 2000.

4. Getting from a rate of 1.5% (or lower) to higher rates, such as >5%, requires that decarbonization be accelerated.

5. However, the world has in recent years seen rates of decarbonization decelerate, and in the most recent few years may even have been re-carbonizing, that is, the ratio of CO2/GDP has been increasing.

6. In 2010 the United States re-carbonized as well.

7. Efforts to secure a global treaty or comprehensive national legislation in the US have not led to an acceleration in rates of decarbonization.

8. In fact, no country or group of countries in the world, despite their statements or policies, has ever achieved sustained rates of decarbonization exceeding 5% per year.

9. Contracting the global economy is not a viable tool for accelerating rates of decarbonization.

10. Actions that lead to an increase in rates of decarbonization are desirable, even if they are justified for reasons beyond climate change.
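The arithmetic behind points 1 through 4 can be sketched in a few lines. The figures below are illustrative placeholders, not actual emissions or GDP data:

```python
# Decarbonization (point 1) is the annual rate of decline of the
# CO2/GDP ratio. This sketch computes the average annual rate over a
# period; all numbers are illustrative, not real-world data.

def decarbonization_rate(co2_start, co2_end, gdp_start, gdp_end, years):
    """Average annual % decline in the CO2/GDP ratio over the period."""
    ratio_start = co2_start / gdp_start
    ratio_end = co2_end / gdp_end
    annual_factor = (ratio_end / ratio_start) ** (1.0 / years)
    return (1.0 - annual_factor) * 100.0

# Example: emissions held flat while GDP grows 1.5%/yr for 20 years
# yields roughly the 1.5%/yr historical rate mentioned in point 3.
rate = decarbonization_rate(30.0, 30.0, 50.0, 50.0 * 1.015 ** 20, 20)
print(round(rate, 2))  # -> 1.48
```

The gap between that historical rate and the >5% per year in point 2 is the crux: emissions must fall in absolute terms even as GDP grows, so the required decarbonization rate is roughly GDP growth plus the desired rate of emissions decline.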

Two quick points before leaving it to the discussion in the comments . . .

First, I recognize that not everyone starts with an acceptance of the assumptions behind statement #2 above -- that is OK, this post is focused on the arguments of the "climate hawks" who obviously do accept the assumptions behind that statement. Please don't clutter the comments taking issue with the premise there (in fact, for those who do, just start with statement #10;-). 

Second, statement #10 above leaves unaddressed the answer to the question, "what actions will lead to an accelerated decarbonization of the global economy?" The honest answer is that no one actually knows how to accelerate decarbonization to >5% per year for a period of decades. Climate pragmatism says that we should look around and see what actions are actually moving in the right direction and to build upon those. In contrast, climate idealism holds that a comprehensive solution implemented all at once is the only acceptable course of action, and absent the ideal, even moving in the wrong direction is preferable.

Pragmatism is about taking the first steps on a long journey and not a comprehensive plan for how the last steps will be taken. That is how we fight disease, manage the economy and win wars.  Climate change will be no different.

There is more to argue of course, but let's start here and see where the critics find fault -- or if they engage at all.

27 July 2011

Commentary in the Chronicle on In-State Tuition Reform

I have a commentary in this week's issue of the Chronicle of Higher Education, and since being put online over the past weekend it has prompted a lot of reaction, including several requests to republish.  To preempt some of the misdirected critique that I've already seen: the piece does not call for the end of in-state tuition (the argument is more nuanced than that), nor does it call for the end of state subsidies for college attendance by in-state students.

To see what the commentary does argue, see it in full here, and feel free to come back here to discuss and debate!

26 July 2011

Climate Pragmatism

In February 2010 I participated in a workshop outside of London organized by Gwyn Prins (LSE) and Steve Rayner (Oxford) to discuss post-Copenhagen climate policy.  One important result of that workshop was a white paper called The Hartwell Paper (named for the location of the meeting).

The Hartwell Paper continues to receive a lot of attention around the world -- particularly in Europe and Asia.  But it has received only minimal attention in the United States.  That is perhaps understandable as only four of the paper's co-authors are based in the US (Shellenberger, Nordhaus, me and Frank Laird) and the arguments did not engage the idiosyncrasies of the contemporary US political landscape.

So a small group of academics and think tankers from across the political spectrum decided that it might be worth trying to explicitly re-interpret the message of The Hartwell Paper in a US context.  So a while back we met in Washington, DC to discuss and debate.  The result is the paper released today -- Climate Pragmatism.

Here is an excerpt:
A new climate strategy should take a page from one of America’s greatest homegrown traditions — pragmatism1— which values pluralism over universalism, flexibility over rigidity, and practical results over utopian ideals. Where the UNFCCC imagined it could motivate nations to cooperatively enforce top-down emissions reductions with mathematical precision, US policymakers should acknowledge that today’s global, social, and ecological systems are too messy, open, and complicated to be governed in this way. Whereas the UNFCCC attempted to create new systems of global governance, a pragmatic approach would build upon established, successful institutions and proven approaches. Where the old climate policy regime tried to discipline a wildly diverse set of policies under a single global treaty, the new era must allow these policies and measures to stand—and evolve— independently and according to their own logic and merits. And where the old regime required that everyone band together around the same core motivation and goals, policymakers today are likely to make the most progress to the degree that they refrain from centrally justifying energy innovation, resilience to extreme weather, and pollution reduction as “climate policy.”
As far as I'm concerned (not sure if my co-authors would all agree), any debate over whether climate policy should move in a more pragmatic direction has been decided -- decisively.  It has and will.  The only real question is how quickly climate activists and policy experts decide to sail with the prevailing winds rather than against them.

Please read the paper and come back here and debate, discuss, critique.

24 July 2011

Bayh-Dole Meets Pay-for-Play

I have a commentary in today's Boulder Daily Camera on  "pay for play" in college athletics.  I argue that helping student athletes capitalize on their economic opportunities is a good idea.  The faculty, most of all, ought to be sympathetic since we went through our own "pay for play" debate 30 years ago.

Here is how it starts:
The on-again off-again debate over whether or not college athletes should be paid is once again heating up. In the Camera (July 15) Neill Woelk argued that paying athletes doesn't make sense, not least because universities are broke.

As a professor at the University of Colorado, this is a reality I know all too well. Since 2005, CU has paid out $5 million to poorly performing head football coaches. Meantime, I and my faculty colleagues who bring in hundreds of millions of dollars in research funding per year have foregone raises the past three years. It could be worse, though: the Chronicle of Higher Education reports that Rutgers University has subsidized its athletic program to the tune of $115 million since 2006, while at the same time foregoing raises across campus last year to save $30 million.

With this background you might expect me to be against paying college athletes, or perhaps even against college athletics in general. To the contrary, college athletics are a great American tradition and an important part of our university culture where we strive for excellence in everything we do. But the first thing to realize in this debate should be obvious -- we already pay college athletes. The right question to ask is whether they should have an opportunity to be paid more. I think that the answer is yes.
Head over to the Camera for the rest, and please come back here and let me know what you think!

21 July 2011

The Policy Advisor's Dilemma

My latest column for Bridges is out.  In it I discuss a challenge faced by the policy advisor -- how to be close enough to a decision maker to participate in decision making, but not so close that independence is lost.

I draw on the work of my friend and colleague Eva Lövbrand who has looked in depth at the role of the EU ADAM project in relation to the needs of decision makers. I compare her analysis of ADAM's research in decision making with the role of intelligence in the decision to go to war in Iraq (an issue I discuss at length in The Honest Broker).

In my column I ask:
What is the difference between the case of WMDs, where policy analysis was provided in response to the stated needs of decision makers, and the case of ADAM in which policy analysis was similarly provided in response to the stated needs of decision makers?
For the answer, please have a look at the column, and please feel welcome to come back here with comments and critique.

Reference:  Lövbrand, E. (2011). Co-producing European climate science and policy. A cautionary note on the funding and making of useful knowledge. Science and Public Policy 38(3): 225-236.

20 July 2011

When Politicians Put Experts Between a Rock and a Hard Place

I have been following closely, but not writing much on, the debate in Australia over Julia Gillard's proposed carbon tax.  How it plays out will be fascinating to watch and will provide as much a lesson in Australian politics as anything to do with climate policy.

This report from The Australian provides a great example of how politicians can make life extremely difficult for those experts who share their goals:
The Prime Minister took her carbon tax pitch to the heart of Australia's $40 billion coal sector today, telling NSW miners her plan wouldn't place their jobs at risk.
 Ms Gillard told workers at Mandalong's Centennial Coal, in the Hunter Valley, that the mine would stay open for as long as there was coal in the ground.

"This mine will continue to work for those 20, 25 years," she said. "It will continue to be here until the end of its productive life."
She goes further even,
The federal government has committed $1.3 billion to protect coal jobs, while Treasury modelling says the industry's output will more than double between 2010 and 2050 under the carbon tax.

But it also says the proportion of Australia's energy supply derived from coal will fall from 80 per cent now to 20 per cent within 40 years. . .

Earlier, Ms Gillard was tackled on ABC radio over the impact of her carbon tax on Australia's biggest coal port.

“How can (the tax) not have a negative impact on economic growth in this region?” an ABC Newcastle presenter asked.

The Prime Minister said Australia would continue to export coal under her carbon tax, dismissing suggestions Chinese demand would tail off as a result.

“There's a strong future for coal mining in this country, it will continue to grow. Employment will continue to grow,” she said.
Tony Abbott, the opposition leader, is plenty happy to hear this line of argument from Gillard:
But the Opposition Leader, speaking in Victoria, said the government plan clearly stated that coal would produce just 20 per cent of the nation's power by 2050.

"The Prime Minister should stop trying to pull the wool over the eyes of people in coal mining regions," Mr Abbott said.

"The whole point of a carbon tax is to get us using less coal. That means less production, less investment and less employment in the coal industry." . . .

Mr Abbott said: "How can it be that it is wrong to burn Australian coal in Australia but it is somehow right to burn Australian coal in China?"
Gillard may or may not believe what she is saying about the future of coal production in Australia -- politicians say all sorts of things in the heat of political battle.  What would be interesting would be to see how policy experts who know better who support the proposed tax respond to a question such as the following:

Is Julia Gillard's commitment to increasing coal production in Australia in the coming decades consistent with efforts to accelerate the decarbonization of the global economy?

From a policy or mathematical perspective there is an obviously correct answer to this question -- No.  It may be the case in the context of Australia's current debate that from a political perspective (as a matter of crass expediency) there is a different answer.  How experts deal with this conflict between policy and politics makes for an interesting case study in the politics of expertise.

On the one hand, if an expert answers the question posed above accurately, then s/he will be seen as giving support to the criticisms levied by Tony Abbott against the proposed tax.  On the other hand, if the expert supports the claims made by Julia Gillard, then s/he will be saying something that is incorrect, giving further ammunition to the opposition.  What would you do?

I'll be looking for how experts address this issue, and I'd welcome your pointers as well.

High Praise for The Climate Fix

Sean Sublette, a broadcast meteorologist at ABC-13 in Virginia, has posted on his station's Weather Expert's blog the sort of book review that warms an author's heart.  Here is an excerpt from his review of The Climate Fix:
Shortly after I discovered his blog, he published a book entitled, The Climate Fix: What Scientists and Politicians Won't Tell You About Global Warming. At first glance, I figured it was another one of those books that either told you the whole blessed science was a hoax, or that we would all die by the year 2050.

It is neither. Not even close. After a recommendation from a well respected colleague, I picked it up.

Despite its title, I suspect Pielke is not telling us how to fix the problem. He is wise enough to understand there is still much uncertainty about this topic and there are no easy solutions.  My guess is that he is referring to the fix that we are in right now, regarding the state of the science, and the political climate in which the science is conducted.

He does a superb job illustrating the strengths and weaknesses of the science, and he is very honest and forthright about how politics plays into this topic. It is one of the most refreshing and honest books I have read about anthropogenic climate change.

There is plenty in this book that will anger those individuals who inhabit the poles of this topic. And that is probably why I like it so much. No one has all the answers. Pielke makes this abundantly clear.

In a television world full of spin, I was almost giddy to see his levelheaded conclusions. The science is sound, but like most sciences, it needs refining. Many politicians and some scientists are abusing the science to advance their causes. Big energy innovations are needed. Geoengineering is a really bad idea. Best new term you will learn in this book: Iron law of climate policy.

Everyone has to find his/her own way on this topic, but there is a lot of noise out there. Hot blogs and cable news only contribute to the noise. Pielke's book provides real signal that rises above the cacophony.

I generally don't recommend books, but this is one that every meteorologist who speaks publicly on the issue should be familiar with. Very few people are qualified to write this book, but Pielke is one of them.
 Thanks Sean, I'm glad you liked the book!

19 July 2011

Sarewitz on NSF Broader Impacts

Dan Sarewitz has a column in Nature this week critical of NSF's well-meaning but ill-suited effort to modify its "broader impact criterion" -- the so-called "Criterion 2" -- for the evaluation of specific project proposals.  Here is an excerpt (PDF):
Last month, the board published a revised criterion, and scientists had until this week to provide comments to the NSF before the final version is issued. But Criterion 2.1, as it might be called, is just as confusing and counterproductive as its predecessor.

At the heart of the new approach is “a broad set of important national goals”. Some address education, training and diversity; others highlight institutional factors (“partnerships between academia and industry”); yet others focus on the particular goals of “economic competitiveness” and “national security”. The new Criterion 2 would require that all proposals provide “a compelling description of how the project or the [principal investigator] will advance” one or more of the goals.

The nine goals seem at best arbitrary, and at worst an exercise in political triangulation. How else to explain the absence of such important aims as better energy technology, more effective environmental management, reinvigorated manufacturing, reduced vulnerability to natural and technological hazards, reversal of urban-infrastructure decay or improved performance of the research system? These are the sorts of goal that continue to justify public investments in fundamental research.

Yet, more troubling than the goals themselves is the problem of democratic legitimacy. In applying Criterion 2, peer-review panels will often need to choose between projects of equal intellectual merit that serve different national goals. Who gave such panels the authority to decide, for example, whether a claim to advance participation of minorities is more or less important than one to advance national security?
Sarewitz also makes this important point:
To convincingly assess how a particular research project might contribute to national goals could be more difficult than the proposed project itself.
Rather than asking PIs to do what is essentially impossible or at least far beyond their expertise, Sarewitz suggests that the locus of accountability needs to move higher up in the agency:
Motivating researchers to reflect on their role in society and their claim to public support is a worthy goal. But to do so in the brutal competition for grant money will yield not serious analysis, but hype, cynicism and hypocrisy. The NSF’s capacity to meet broad national goals is best pursued through strategic design and implementation of its programmes, and best assessed at the programme-performance level. Individual projects and scientists should be held accountable to specific programmatic goals, not vague national ones.
Ultimately, it is important to recognize that NSF is one of only three agencies with a legislated mandate to do "basic research" rather than agency-mission focused research (the two others being NASA and DOE's Office of Science). Asking NSF to become more mission oriented undercuts the overall purpose of the agency.

In my view (Sarewitz does not go this far) NSF should scrap the Criterion 2 at the project level and evaluate proposals on their scientific merit using peers with appropriate expertise. Setting the overall direction of NSF's research portfolios will inevitably be a political exercise, and that is a more appropriate location for efforts to better connect NSF with national needs.

13 July 2011

The 2011 Tornado Losses in Context: A Preliminary Analysis

I spent the past several days in Hamilton, Bermuda at a conference organized by the Geneva Association (agenda here in PDF). I gave a talk that used several analyses of US hurricane losses to illustrate the importance for decision making of distinguishing risk (situations where probabilities can be known) from ignorance (situations where they cannot).

Among the many interesting presentations at the meeting was an excellent discussion of the 2011 US tornado season by Kevin Simmons, a professor of economics at Austin College in Dallas. Kevin is one of the nation’s experts on the societal impacts of and responses to tornadoes. He is a co-author (along with Daniel Sutter, an economist at Troy University) of the recent book, Economic and Societal Impacts of Tornadoes (University of Chicago Press, 2010).

My attention was drawn to one slide in particular from Kevin’s talk (shown at right) which updates the tornado damage figures first presented in his book (with data for 1950-2007) through 2010 (the data is adjusted for inflation). The Simmons/Sutter dataset is the only one that I am aware of that shows US tornado damage since 1950, and is drawn from data kept by the US National Weather Service using a consistent methodology. (Note that the data since the mid-1990s is more accurate than that collected previously, because the NWS cataloged damage using a range in the earlier years. Kevin tells me that they used the midpoint of the range estimates in assembling the data for the earlier years.)

The large economic losses due to the tornadoes of 2011 raise an interesting question: How unusual are the tornado losses of 2011?

The Simmons/Sutter dataset provides one means to answer this question. Regular readers will know that it is not enough to look at disaster data over time, even if it has been adjusted for inflation, for the simple reason that society is always changing. The exact same weather extreme occurring years apart will cause more damage if people build more property and accumulate more wealth in the area affected. Simmons' figure shown above has no such “normalization” applied to it.

Actually performing a full normalization of tornado loss data will require some considerable research and effort, and Kevin and I have agreed to take this on together, with a paper on the subject the ultimate goal. But for now, simply because I am curious and have some time to kill on a plane over the ocean, I have calculated the ratio of tornado damage to US GDP for the period 1950 to 2011 as a rough first cut. This calculation should be considered preliminary and, of course, is not peer reviewed.

Here is what I've done: First, here is a graph of tornado damage to GDP for the period 1950 to 2010 using the data from Simmons’ presentation (from the graph shown at the top of this post) and US GDP data from BEA. The graph is scaled such that the 1950 to 2010 average ratio is 100.

Remarkably, only one year (1989) was above the long-term average in the final 28 years in the series (1983-2010). By contrast 13 of the previous 33 years were above average (1950-1982). While GDP is a rather blunt tool for a normalization, it makes more sense for tornadoes and floods than hurricanes (because hurricanes are geographically concentrated, floods and tornadoes less so). Also, a normalization of hurricane losses adjusted for GDP has been shown to fairly closely approximate a more sophisticated methodology, so it is not unreasonable to start there. The data shows that as a fraction of GDP, tornado damage decreased appreciably from 1950 to 2010.
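The normalization described above is simple enough to sketch in code: divide each year's damage by that year's GDP, then rescale so the long-term mean equals 100. The figures below are illustrative placeholders, not the actual Simmons/Sutter series:

```python
# Sketch of a simple GDP normalization: damage/GDP per year, indexed
# so the mean over the period equals 100. Values are illustrative,
# not the real tornado-loss or BEA GDP data.

def normalize_to_index(damage_by_year, gdp_by_year):
    """Return each year's damage/GDP ratio, scaled so the mean is 100."""
    ratios = {yr: damage_by_year[yr] / gdp_by_year[yr]
              for yr in damage_by_year}
    mean_ratio = sum(ratios.values()) / len(ratios)
    return {yr: 100.0 * r / mean_ratio for yr, r in ratios.items()}

# Illustrative values in $bn for three sample years
damage = {1950: 0.4, 1989: 2.0, 2010: 1.1}
gdp = {1950: 300.0, 1989: 5600.0, 2010: 14960.0}
index = normalize_to_index(damage, gdp)
print(sorted(index))  # the index values average 100 by construction
```

Because GDP has grown far faster than nominal damage in these placeholder numbers, the early-year index values come out well above the later ones, which is the same qualitative pattern described in the text.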

What about 2011? In his talk, Simmons presented a range of estimates for the costs of the 2011 tornadoes, with the largest being $23.2 billion. He expressed some skepticism about that level being ultimately reached, but said that it was “in the realm of possibility” so I’ll use that number. I also assume that 2011 GDP is 3% higher than 2010. The following graph shows the tornado damage to GDP through present with the 1950-2010 average equal to 100.
You can see that 2011 is by this measure an extreme year, ranking number 3 since 1950. The loss-to-GDP ratio figure looks remarkably like the figure that shows casualties over time.  It will also be worth comparing to the incidence of strong tornadoes.  And just as was the case with casualties, there is no indication of a secular trend in the direction of greater damage. In fact the opposite is the case.

This preliminary analysis (and note that it is preliminary; it is not a prediction of what future research might show) suggests that while 2011 is the costliest tornado season in absolute terms by a large margin, the impacts seen in this tragic season should not be considered to lie outside the range of expectations based on historical loss data viewed in the context of the overall economy. Based on this preliminary look at the data, there is no evidence since 1950 of an increase in tornado damage, whether adjusted for inflation or compared to GDP.

11 July 2011

American Politics in a Single Graph

H/T: Lowy Interpreter

Making Stuff Up at Real Climate

Real Climate has an interesting post up today in which Gavin Schmidt reports that, according to a new analysis, errors in climate data (ocean temperatures) may lead to a reduction in the 1950-2006 global temperature trend of 17%, perhaps more.  This is a big deal scientifically and speaks to the certainty -- in this case excessive -- with which climate science is often reported, especially by the IPCC.

As far as Real Climate is concerned it is interesting that they seem to think that the story here is not the revisiting of the science of global temperatures, but how they can score some points against fellow bloggers (Maybe trying to change the subject?).

Unfortunately, in my case (and according to Steve McIntyre, in his as well) they engage in some fabrication to try to score those points, incorrectly claiming that I had offered a “prediction” of how the science on this issue would evolve. When called on this, Gavin first admitted that he could be confused (he was), but when I pointed out to him exactly how he was confused, he decided to dig in his heels.

Actually, on Prometheus I and a number of commenters did what people normally do when they hear about interesting science -- we discussed, probed, questioned, hypothesized, explored. Schmidt seems upset that people engaged the subject at all. For my part, I discussed the issue of temperature adjustments in some depth (e.g., here) and offered up a few conditionals that spanned the scope of possibilities (and even had an exchange with Gavin et al. on the subject). But I offered no predictions of how the science would turn out.  As readers here know, I predict football but not science.

Taking a look back at my discussion of the temperature trend issue at Prometheus from 2008 for the first time since it was written, it actually stands up pretty well:  I asked, “Does the IPCC’s Main Conclusion Need to be Revisited?”  The answer would seem obviously to be “yes” if it is indeed the case that 17% of the global surface temperature trend that the IPCC thought it had fully accounted for was actually measurement error.  Oops.  But that sort of thing -- learning something new about something we thought we had settled -- happens in science, and it should not be a surprise or a scandal.

But that is just science.  On the apparently much more important issue of the blog wars, Gavin Schmidt has decided to let his fabrication stand and has encouraged and published the usual cheerleaders piling on and adding to the misinformation in the comments, unfortunately making this post necessary.  Richly ironic.  I do not miss sparring with those guys.

10 July 2011

A Bad Analogy in Australia's New Climate Policy Proposal

Australia has released its much awaited carbon tax proposal (here in PDF).  I am just now browsing through it.  This analogy in the document strikes me as particularly unfortunate:
The Government has committed to reduce carbon pollution by 5 per cent from 2000 levels by 2020 irrespective of what other countries do, and by up to 15 or 25 per cent depending on the scale of global action.

Meeting the 5 per cent target will require abatement of at least 159 Mt CO2-e, or 23 per cent, in 2020 (Figure 2.4). This is equivalent to taking over 45 million cars off the road by 2020.
Why do I say an unfortunate analogy?

Well, Australia has only about 12 million cars (and 16 million total vehicles), so using a reduction of 45 million cars "off the road" to illustrate the unilateral emissions reduction goal simply illustrates the impossibility of the task. Under this analogy, even getting rid of every vehicle in Australia would leave the country about 30 million vehicles short (for better ways to illustrate the magnitude of the Australian emissions reduction challenge, see this paper). In any case, at current rates of growth Australia will have 5 million more vehicles in 2020, not fewer.
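The arithmetic behind the "unfortunate" label is simple enough to check in a couple of lines, using the approximate vehicle counts quoted above:

```python
# Rough check of the analogy using the approximate figures in the post.
cars_equivalent_removed = 45_000_000    # the proposal's "cars off the road"
total_australian_vehicles = 16_000_000  # cars plus all other vehicles

# Even scrapping every vehicle in the country falls far short.
shortfall = cars_equivalent_removed - total_australian_vehicles
print(shortfall)  # 29000000 -- roughly 30 million vehicles short
```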

Back to browsing -- I hope that other aspects of the policy don't prove to be similarly impossible.

08 July 2011

Space Shuttle Costs Revisited

[UPDATE: More here from NPR.]

As the space shuttle orbits the earth for the last time, the WSJ Numbers Guy has an interesting overview of efforts to calculate the costs of the Space Shuttle program over its life. Here is an excerpt that describes how I first got onto this subject about 20 years ago (see, e.g., this paper in PDF):
Roger Pielke Jr., a political scientist at the University of Colorado, Boulder, first estimated the shuttle's cost to the National Aeronautics and Space Administration through the early 1990s. He was surprised to be assigned the project by his master's thesis adviser, Rad Byerly, who had just completed a stint as staff director of a House space and aeronautics subcommittee. "I said, 'Isn't this something you could snap your fingers and find out?' " Prof. Pielke recalls.
It is strange that NASA suggests to the WSJ that the best way to add up budget numbers is to avoid adjusting for inflation. In my classes on budgeting, that perspective earns a failing grade.
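To see why summing unadjusted figures fails, convert each year's nominal outlay to constant dollars before adding. A minimal sketch with illustrative outlays and deflators (not actual NASA budget data):

```python
# Convert each year's nominal outlay to constant (say, 2010) dollars
# before summing. Deflator = price level relative to the base year;
# all figures below are invented for illustration.
nominal_outlays = {1975: 1.0, 1985: 2.0, 1995: 3.0}   # $ billions, current
deflator_to_2010 = {1975: 3.9, 1985: 2.0, 1995: 1.4}  # rough multipliers

real_total = sum(nominal_outlays[yr] * deflator_to_2010[yr]
                 for yr in nominal_outlays)
nominal_total = sum(nominal_outlays.values())

print(nominal_total)         # 6.0 -- the "failing grade" sum
print(round(real_total, 1))  # 12.1 -- same spending in constant dollars
```

With a program spanning three decades of inflation, the unadjusted sum understates real spending by roughly half, which is why the adjustment matters.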

The Social Construction of Black Swans

A "black swan" event, in the words of Nassim Nicholas Taleb, is characterized as follows:
What we call here a Black Swan (and capitalize it) is an event with the following three attributes. First, it is an outlier, as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility. Second, it carries an extreme impact. Third, in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.

I stop and summarize the triplet: rarity, extreme impact, and retrospective (though not prospective) predictability. A small number of Black Swans explains almost everything in our world, from the success of ideas and religions, to the dynamics of historical events, to elements of our own personal lives.
The Fukushima nuclear disaster certainly would seem to qualify.  However, a closer look at why the disaster occurred reveals that it was a black swan of our own making.  Here is an explanation of why this is so, from a recent WSJ story (thanks TB! Emphasis added.):
A design flaw at Japan's Fukushima Daiichi nuclear plant -- one that senior engineers had known about for years -- caused the cooling systems to fail at four reactors after the March 11 earthquake and tsunami, according to company records and interviews with current and former employees.

Tokyo Electric Power Co., the plant's operator, used two different designs for protecting the backup generators for its cooling systems at 10 reactors in the Fukushima region. The cooling systems at the reactors with the older design failed, causing fuel meltdowns and explosions.

The older Fukushima reactors, dating back to the 1960s and built with General Electric Co.'s Mark I design, housed their electric-switching stations in exterior buildings vulnerable to the tsunami's waves. Newer reactors, meanwhile, house these stations in their sturdy main buildings.

When the waves knocked out the switching stations at the older reactors, they rendered the backup generators useless. "Once water gets in there, the whole thing is kaput," said Katsuya Tomono, a former TEPCO executive vice president.

GE said its reactors are safe and that any design flaws are TEPCO's fault because the company was in charge of design changes. Current and former TEPCO engineers say that in retrospect, they should have done something about the flaw.
It is well understood that black swan events emerge from the murky unknown.  What is less appreciated is that the murky unknown is often a region of our own creation.

07 July 2011

Why the IPCC Has Lost Trust

The IPCC is now one train wreck after another.  After being embarrassed by the spectacle of a Greenpeace energy scenario being elevated to top level prominence in a recent report on renewable energy by an IPCC author from Greenpeace, the IPCC compounds that error by trying to explain it away with information that is at best misleading if not just untrue.

In a letter to the Economist this week Ottmar Edenhofer, co-chair of IPCC Working Group III (which produced the recent report on renewables) dresses down the magazine for not recognizing that the IPCC has procedures in place to deal with the possibility that authors might impose their biases:
The IPCC has now approved a formal policy on conflicts of interest as recommended by the InterAcademy Council, a network of national science councils. This is an already endorsed increment in a pervasive system and is not a first step in a whole new area. Our new special report on renewables continues the tradition of balanced, thorough assessments at the IPCC.
What Edenhofer does not mention is that the IPCC conflict of interest policy will not be implemented until some time after 2014, after the current (fifth) assessment report is done (and of course it did not apply to the recent renewables report). The yet-to-be-implemented COI policy is completely irrelevant to any discussion of the renewables report.

The IPCC Chairman Rajendra Pachauri explained the reason for the delayed implementation recently to the Economist:
Of course if you look at conflict of interest with respect to authors who are there in the 5th Assessment Report we’ve already selected them and therefore it wouldn’t be fair to impose anything that sort of applies retrospectively.
If you think about it, fairness to IPCC authors who have conflicts of interest (most notably Pachauri himself) is an interesting concept. One might argue that the legitimacy of the organization outweighs a need for such fairness to conflicted authors, but I digress. 

The IPCC involves many sincere people who put forth a lot of effort. It is a shame to see that effort repeatedly scuppered by the inability of the IPCC leadership to recognize that trust and legitimacy are essential to its job. When will the climate science community stand up and demand more effective leadership?

What I Did on My Summer Vacation

I've started a new blog.  It's called The Least Thing (screen shot above) and it will occupy an important slice of my time going forward. It will be the home of my sports-related blogging (and soon, research) including of course the EPL and Bundesliga prediction contests which are coming soon -- as soon as the damn transfer market gets sorted (I may have to figure out how to tell my 7 year-old that his Nasri poster has to come down).

This blog will continue more or less as before (but probably a bit less) minus the sports excursions. To all my valued commenters on sports on and off blog, please head over to The Least Thing!

06 July 2011

An American Wivenhoe

Long-time readers may remember discussions here earlier this year about the management of the Wivenhoe Dam near Brisbane, Australia and the role of such management in the flooding of Brisbane.  One of the issues in that flooding was the role of reservoir management in the magnitude of the flood, which I described as follows:
Wivenhoe Dam near Brisbane, Australia is at the center of controversy in its role in the recent flood. The dam, as is commonly the case, is expected to serve two seemingly contradictory functions.  On the one hand it is a buffer against drought, meaning that it is desirable to keep it more full in the eventuality of low precipitation.  On the other hand, the dam is a buffer against floods, meaning that it is desirable to keep it more empty in the eventuality of heavy precipitation.  Since keeping the reservoir full and empty are not simultaneously possible, it then is necessary to balance these objectives.  Since future precipitation is uncertain, the dam's management is thus a matter of decision making under uncertainty (where risks are known) and ignorance (where they are not).
A reader (thanks DB!) passes on a lengthy article from the Great Plains Examiner out of Bismarck-Mandan, North Dakota on the management of upper basin reservoirs on the Missouri River.  The article describes a decision context and outcome remarkably similar to what we saw in Australia earlier this year.
The period between March 20 and May 6 has been difficult for the Army Corps of Engineers to explain. During that span, the Corps’ water managers kept river levels low and stockpiled near-record amounts of water behind the three upper basin dams on the Missouri River, despite evidence that the Rocky Mountains were holding a lot more snow than normal.

The reservoirs were so full by early May that they couldn’t contain the late-spring rainfall that pounded Montana and the Dakotas.

Public records studied by the Great Plains Examiner show Fort Peck, Garrison and Oahe dams each were holding more than 99 percent of their total water capacity in late April. Lake Sakakawea, the largest reservoir along the river, had risen 10 feet into the flood-control zone before the Corps of Engineers began ramping up release rates from Garrison Dam to create storage space for the heavy rain and melting snow.

Emergency releases from the reservoirs in June flooded communities along a 1,700-mile stretch of the Missouri River. Almost immediately, people who live in the watershed accused federal water managers of mismanagement, officials with the Corps of Engineers pointed at their operations manuals . . . They said the problem wasn’t how they managed the reservoirs; instead, they blamed a set of conflicting congressional mandates and pressure from political leaders up and down the river system to manage the water for special interests including recreational boaters, environmentalists and the energy industry.
The article includes a remarkable admission from Jody Farhat, the chief water manager for the Missouri River reservoir system:
By the end of April, Lake Sakakawea had risen to near-record levels and the Corps of Engineers realized for the first time that the mountains were holding about 40 percent more snow than average. Until then, the Corps was expecting about 10 percent more snow than normal, Farhat said.

“By May 1, we were going ‘Holy smokes, this is not just a little-above-normal water year. It’s way above normal,’” she said.

Larry Larson asks a question to which I suspect he has an answer:
“What were they doing in the winter months and early spring when this was building?” asked Larry Larson, executive director of the Association of State Floodplain Managers. “Were they preparing for it, or were they playing the odds and then found themselves caught in a box?”

05 July 2011

Another Extreme Event Caucus: National Journal

I am participating in another round-up of opinion on extreme events and climate change over at the National Journal.  This time it is with a group of political types, which so far includes Representative Earl Blumenauer (D-OR), David Hunter (IETA), Gene Karpinski (LCV), Dan Lashof (NRDC), Eileen Claussen (Pew Climate), Carl Pope (Sierra Club), Nathan Willcox (Environment America) and William O'Keefe (Marshall Institute).

My piece is essentially the same as what I provided to Yale e360 not long ago, just a little expanded.  The other submissions are far more interesting and in general would make great grist for an essay by the Bizarro World Chris Mooney.

David Hunter says some smart things about the science, and while Rep. Blumenauer could not be more wrong about the science, he gets the policy conclusions exactly right.  Throughout there is the usual litany of recent extremes and their human and economic costs, along with assertions of how they must be linked to human-caused climate change.  Support for these assertions is provided by mentioning news articles and NGO reports, several mentions of rolling a 13 with loaded dice, and one extended analogy to splattering spaghetti sauce.

You'd think that with this line-up, William O'Keefe, the lone "skeptic" included in the round-up, would be able to hit a home run.  Instead, he strikes out.

A Measure of Fukushima's Impact on Nuclear Power

A lot has been written about the consequences of the Fukushima nuclear disaster for the future of nuclear power.  Much of the discussion has focused on the high-profile cases of Germany and of course Japan.  But overall, it seems that the aggregate effects of the disaster on the global prospects for nuclear power are pretty small. 

Evidence for this comes from this nugget buried deep in the FT, which I was very surprised to see (I would have thought that the terminated or delayed number would have been much larger):
Of 570 units planned before Fukushima, only 37 have been axed or put on hold since the crisis, according to Arthur D. Little, a consultancy.
And while Germany steps back from nuclear power, it looks like the UK is jumping back in with renewed vigor:
The British nuclear industry is about to enjoy a “renaissance” and the country must become the “number one destination” for investment in new reactors, the energy minister will say on Tuesday.

Charles Hendry will deliver the most enthusiastic ministerial endorsement yet of the nuclear industry’s ambition to build a new generation of power stations.

In his speech to the Nuclear Industry Association, seen by the Financial Times, Mr Hendry will say: “The UK has everything to gain from becoming the number one destination to invest in new nuclear. Nuclear is the cheapest low-carbon source of electricity around, so it can keep bills down and the lights on.”

A dozen new reactors are set to be constructed at eight sites in England and Wales, with the first due to be completed in 2018. The total cost of the programme, the most ambitious in Europe, is forecast to be at least £50bn.
Reports of the death of nuclear power have clearly been exaggerated.