Regional Climate Information: Real-world use (2)
There is, perhaps to an outsider, a curious contradiction evolving in the climate-science community. On one hand, we have concluded that the warming of the planet is unequivocal, attributable to the release of greenhouse gases from fossil fuel consumption, and that we need to act to limit the warming – which, most simply, means reducing emissions from fossil fuels. Therefore, we feel that there is enough confidence in climate change projections to warrant changes in the very foundation of our energy consumption, and hence, the foundation of our economic success.
On the other hand, when we talk about the need for climate projections to contribute to planning for adaptation to climate change, there are many in the community who argue that the projections are so uncertain that it is not possible to provide such information. An example of this sort of decision might be the size of the drain pipes in urban flood-control systems, or the need for a barrier to protect a city's water supply.
This contradiction, that we know enough to say we have to do something but not enough to say what to do, is not a comfortable situation. To some, it raises issues of basic credibility; it is certainly fuel for the political position that it is too risky to our economic well-being to take action on climate change. This sort of contradiction is, however, not so unusual. Think about the floods in Pakistan. We knew for two to three weeks that the water was flowing down the Indus Valley, but beyond a general sense, it was not clear what to do downstream. Sindh is still flooded.
Those people most interested in developing adaptation plans often want numbers, digital data, for the year, say, 2040. Their intuition is to ask for data that looks like today’s weather station observations. The reason for this is relatively simple – there are present-day tools for design and warnings that have been developed to use weather data and weather forecasts. This recognizes the implicit fact that weather is how climate interacts with people.
Though we have developed some skill in seasonal prediction, largely based on our ability to predict the El Niño-La Niña cycle, we have not developed much skill in actual climate prediction (see, for instance, here and here). By climate prediction I mean, for example: will there be a flood at the confluence of the Mississippi and Ohio Rivers in June of 2019? The conclusions drawn from climate models, with varying degrees of confidence, that there will be more intense floods and droughts, use the models to provide guidance. This guidance is combined with basic theoretical knowledge, such as the fact that warm air holds more water and hence can support more intense storms. In some cases, we can use observations from the past to provide circumstantial evidence supporting the robustness of our conclusions. With this information it is possible to provide guidance for those trying to make decisions, but it is a complex process that requires inputs from a variety of people who are knowledgeable about the circumstances of a particular problem. This expert guidance or advice is sometimes referred to as translation, and more and more, we understand the need to have translators at the interfaces of all of the different types of expertise needed for problem solving. We understand the need to cogenerate solutions, and that one field of study, climate science, handing off information to another field, say city wastewater management, does not work so well. We simply cannot provide weather-like data without qualification.
I threaten to digress. A comprehensive climate model can provide a set of numbers time stamped with every hour of any year at every point on Earth. It is relatively straightforward to produce a set of numbers that looks like the Wunderground Personal Weather Station network in Chicago for the year 2043. In fact, we have talked about this as a cool thing to do for the climate page. Most Wunderground devotees would immediately recognize that such a set of numbers may constitute little more than a party trick. There is not enough skill to pick out in which years there will be regional droughts, much less whether or not it will rain in Naperville on July 4, 2043.
Nevertheless, a huge industry has grown up in the past few years that not only takes archived climate model output and tries to increase its effective resolution through a variety of methods, but also uses weather generators to produce, for example, daily high and low temperatures. This is called downscaling: the process of taking coarse-resolution information and adding fine-resolution information to customize it for a particular application. As you might imagine, opinions about this process vary widely. Some scientists think it is a waste of time and resources; others think it is a critical step in developing necessary climate adaptation plans. (For those who want to know more: a whole bunch of downscaling references from my class. And for your pleasure: ClimateWizard and the Canadian Climate Change Scenarios Network.)
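To make the idea concrete, one of the simplest statistical downscaling approaches is the "delta" or change-factor method: take the change the coarse model projects for a grid cell, and apply it to the fine-scale station record. The sketch below is a toy illustration with invented numbers, not any group's production algorithm; real methods handle precipitation, variance, and many other subtleties.

```python
# Toy "delta method" downscaling sketch. All numbers are invented
# for illustration; a real application would use gridded model
# output (e.g., from the CMIP-3 archive) and quality-controlled
# station observations.

def monthly_deltas(model_baseline, model_future):
    """Change factors: future minus baseline model mean, per month."""
    return {m: model_future[m] - model_baseline[m] for m in model_baseline}

def downscale(station_obs, deltas):
    """Shift each observed (month, temperature) record by that month's
    model-projected change, yielding a 'future-like' station series."""
    return [(month, temp + deltas[month]) for month, temp in station_obs]

# Hypothetical July/August daily station temperatures (degrees C)...
station_obs = [(7, 24.1), (7, 26.3), (8, 25.0), (8, 22.7)]
# ...and coarse model means for the grid cell containing the station.
model_baseline = {7: 23.0, 8: 22.5}
model_future = {7: 25.5, 8: 24.0}  # invented mid-century means

deltas = monthly_deltas(model_baseline, model_future)  # {7: 2.5, 8: 1.5}
future_series = downscale(station_obs, deltas)
print(future_series)
```

The appeal, and the danger, is visible even in this sketch: the output has the day-to-day texture of real station data, but all of that texture comes from the historical record; only the monthly shift comes from the model.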
From a market perspective, there are many customers who want downscaled information, and the basic information to feed downscaling algorithms is readily available through the CMIP-3 archive. Therefore, whether or not a subset of climate scientists thinks that downscaling makes sense, there will be downscaling of climate projections, and that information will be used.
Early in the 1990s I was involved in ozone research, and in particular, the development of weather-resolving global ozone models. These models challenged not only the computational resources of the time, but also the human resources needed to evaluate their quality and interpret their results. In a meeting in the Damon Room of the National Center for Atmospheric Research, we were discussing the use of this new generation of model in official United Nations assessments of ozone depletion. I was on the side that argued it was too early to use these models, and that we needed months if not years to assess their quality and assure their robustness. On the other side was the argument that these new-generation models WOULD be used; they existed, and someone would use them. One stream of the argument was that it was the responsibility of those most knowledgeable about both the strengths and weaknesses of the models to try them out in the assessment studies.
I was on the wrong side of that argument in the Damon Room. It was true that new generation models would be used for a whole variety of reasons, ranging from scientific reasons to reasons of one research group trying to make their mark relative to another group. Not only was there a responsibility for the leading research groups to participate, but there was also a lot to be learned from trying to do those assessments.
In the discussion about whether or not model projections are ready for applications, arguments are made that addressing applied problems is not really science, and that a focus on applications diverts resources from needed science and the best-trained minds from needed research. Such a position, however, does not recognize the challenging research problems in how to use climate information in real-world applications. Neither does it recognize that the demand for information is there, and that that demand will be met in some way.
Imagine that you are spending money on bridges or power plants or flood control. These expenditures are expected to last generations. You know that you need to consider climate change. You need to consider climate change in concert with many other issues, and the question might reduce to: what incremental change do I need to make to my plans to accommodate climate change? Or the question might be more severe: is salt water intruding into my water supply? As a decision maker you are concerned technically and ethically. You might need to answer to political concerns, and increasingly, you are answering to your insurance company. You need climate data now; you can't wait until the skill scores of decadal predictions improve.
The scientific investigation of climate has revealed the need to do something, and you cannot wait until a certain skill score is achieved in climate models. There are many ways you can get some information. It would be nice to get vetted and branded information, but in its absence, you can still get some information. Usually the information that you use will be strongly influenced by ease of access and use. It would be nice to have easy access to the best available data at any given time. But it is not a simple matter to define "best available," and it is not a simple thing to manage the logistics of access.
If you examine the problem from the generation of climate knowledge to the use of climate knowledge, then there are research issues all the way along the path. Above, I mentioned the need for cogeneration of solutions to problems. Cogeneration means that all of the information providers work together in developing solution paths. With this participation, the users of climate information learn how to account for the uncertainties of climate projections in their problems, and climate scientists learn the requirements that the users face.
We have been studying the use of climate information for at least 20 years, and from this experience, we know that it is naïve to imagine simply providing digital data. There is a need to develop translation services to complement the digital data. We know from experience in the weather community that the notion that we can make the provision of digital data operational, somehow separated from research, is far from optimal. It sets up barriers to new developments that might improve forecasts, and it sets up barriers to the best use of information. The idea that we might wait until the climate projections achieve some undetermined skill level, and only then pass them off as useful, neglects the fact that the bottleneck in the use of climate data does not lie first and foremost in the quality of the climate projections. This wait-until-the-data-are-better position neglects the research from years of learning how to use climate data. In fact, waiting until the data are better serves to fuel a wait-and-see approach in the development of policy and of solution paths. We have made the argument that climate projections are robust enough to motivate controlling the emissions from fossil fuels. We need to address with equal energy the problem of determining the size of the levees in Fargo and New Orleans.
First Blog in this series
Pakistan: I am certain to maintain an interest in Pakistan far longer than the average disaster attention span. My youngest sister Elizabeth is Consul General in Peshawar, so I keep an eye on the news. Sindh is still flooded. Attention to the Pakistan flood is a moral imperative, a humanitarian imperative, and a security imperative. (Pakistan Flooding: A Climate Disaster, Yours truly on Chicago-based Radio Islam, Rood interview)
Here are some places that my sister has recommended for the humanitarian crisis in Pakistan, organizations she sees at work there.
Doctors Without Borders
The International Red Cross
MERLIN medical relief charity
U.S. State Department Recommended Charities
The mobile giving service mGive allows one to text the word "SWAT" to 50555. The text will result in a $10 donation to the UN Refugee Agency (UNHCR) Pakistan Flood Relief Effort.
Portlight Disaster Relief at Wunderground.com
Figure 1. Despair of Pakistan’s forgotten flood victims: BBC coverage of continuing flood in Pakistan