I'm a professor at U Michigan and lead a course on climate change problem solving. These articles often come from and contribute to the course.
By: Dr. Ricky Rood, 5:04 AM GMT on November 30, 2010
Stickiness and Climate Models: Open Climate Models (2)
In the previous entry I motivated the need for communities other than scientists to have access not just to the results from climate models, but to the ability to configure climate models for possible changes to the Earth’s surface and to investigate the impact of those changes. An example I used was the possibility of a project to irrigate the Sahara – a project where it was reasonable to ask how both weather and climate might be modified.
We don’t have to imagine futuristic projects like this to make the argument that more access to configurable, evaluated climate models is needed. Some might recall an entry where I was writing about managing aerosols and greenhouse gases other than carbon dioxide to control warming in the near term. That entry had been motivated by an outstanding presentation by Professor V. Ramanathan from the University of California San Diego. (I recommend specifically this part of Ram’s web page.) He was talking about an experiment to investigate whether changing cook stoves in India could reduce black carbon in the atmosphere, leading to reduced warming of the planet. While Professor Ramanathan has access to climate models and to experts who can design model experiments, he is not the only party interested in the execution and results of those experiments. It is easy to see that all of the regional governments would be interested in their own evaluations; many non-governmental organizations would be interested, as well as corporations and citizens.
In order to get buy-in from all of these entities, people will want to be able to evaluate the information and its quality. They are likely going to want to pose their own questions. If such an undertaking were to proceed under the auspices of a treaty, then it is easy to imagine a country wanting to, say, develop its own climate modeling capabilities. And, of course, we will want to evaluate whether or not any action has had the predicted effect. Finally, remember that a scientific evaluation would require that independent researchers verify the information from other researchers.
My argument, suggested in a couple of earlier entries, is that community approaches are called for because of the complexity and ultimate scale involved (Using Projections, Downscaling). This stands in contrast to other ways of approaching the problem, for example, users forming collaborations with scientists at universities and laboratories, or a new breed of climate consultancy with the needed expertise. No doubt, these other forms of developing climate information will occur and grow; that is the way weather information is obtained. To restate: I don’t think that a simple extension of the way we provide weather services will provide what is needed for climate services.
I want to state, explicitly, that I am in no way making the statement that the community of climate scientists and the availability of climate data and climate information are, fundamentally, closed. In fact, I have argued the contrary: by the standards of any large, complex knowledge base that I can think of, the data, the analysis, and the deliberations of the climate community are free and available (for example, Trust, but Verify, Strength in Many Peers). And without exaggeration, historian Paul Edwards has studied both weather and climate science as pioneering examples of the development of data- and information-sharing communities (A Vast Machine). That the climate community is excessively closed is part of the political argument. If any readers are aware of good studies about the openness of research communities, then please send me (directly) references. My argument is that the requirement to extend the use of climate information to uncountable application communities challenges the current notions of community.
The provision of climate models that are configurable by non-scientists, presumably non-expert communities, is difficult and controversial. I recently gave a talk on this subject at Supercomputing 2010, and the slides of my presentation are linked here. In the next few articles in this series I want to explore some of the challenges that need to be overcome if there were to be open innovation and development of climate models, some ideas on how to address those challenges, and some strategies for thinking about uncertainty in climate projections.
Developing Climate Models: Some basic problems
A climate model is built from component models that represent the atmosphere, the oceans, the land surface, and the Earth’s ice (the cryosphere). Each of these models is in turn composed of sub-component models, for example, cumulus cloud models. Look around at the clouds, the sky, the plants, the people, the landscape, the streams, and ask: how do I represent these things as numbers? How do I represent how they will change? How do I represent how they interact with each other? Ask these questions and you start to appreciate what needs to be included in a climate model. The answers get written up as narratives and computer code that, in some approximate way, represent both the observed behavior and how that behavior changes. This leads to hundreds of thousands of lines of computer code, representing the knowledge of hundreds of researchers of many types. Bringing all of this together is a big management problem. Making sure that all the pieces work together is not straightforward; there is no single prescription; it is, at times, arcane and artistic.
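The idea of component models that must be advanced together, exchanging information as they go, can be sketched in a few lines. This is a toy illustration only: the class names, constants, and the trivial "physics" here are all invented for the sketch, and real coupled models (and their couplers) are vastly more complex.

```python
# Toy sketch of a coupled climate model: each component advances its own
# state, and a coupler passes a flux between them once per time step.
# All names and numbers are invented for illustration.

class Atmosphere:
    def __init__(self):
        self.temperature = 288.0  # global-mean surface air temperature, K

    def step(self, ocean_flux):
        # nudge the air temperature by the flux the ocean handed us
        self.temperature += 0.1 * ocean_flux
        return self.temperature

class Ocean:
    def __init__(self):
        self.sst = 290.0  # sea-surface temperature, K

    def step(self, air_temp):
        # relax the ocean surface toward the overlying air temperature
        flux = 0.05 * (air_temp - self.sst)
        self.sst += flux
        return flux

def run_coupled(n_steps):
    """Advance both components in lockstep, exchanging a flux each step."""
    atm, ocn = Atmosphere(), Ocean()
    flux = 0.0
    for _ in range(n_steps):
        air_temp = atm.step(flux)
        flux = ocn.step(air_temp)
    return atm.temperature, ocn.sst
```

Even this two-component cartoon shows the essential management problem: neither component can be run, tested, or changed in isolation, because each one's state depends on what the other handed it at the last exchange.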
Figure 1. Components of a model of the Earth’s Climate.
Add to this inherent tangle of ideas and code our history, and the problem only gets harder. We build on existing models, which requires us to use what already exists. In some cases it is safe to say that there is computer code 30 years old, written in languages that are no longer taught. It’s a little like trying to keep ancient stone buildings from falling down. This heritage code carries a stubborn inertia that inhibits change and modernization.
Then add to this heritage code the nature of the computational problem. For as long as I have been a scientist, say 30 years, weather and climate models have required the largest computers available, and these supercomputers are not programmed like your Apple or your PC. I know people at NASA today, Putman among them, who are trying to scale climate models to run on more than 100,000 processors. To be clear, that is a single model requiring 100,000 processors to run in concert with each other, which is far different from having 100,000 little models running independently. (Weather fans should remember L. F. Richardson.) And we cannot stop the weather forecasts and the climate assessments to build something fundamentally new; our mission requires us to keep working with what we have.
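The distinction between one tightly coupled model and many independent ones can be sketched with a toy one-dimensional diffusion problem, where each "processor" owns a slice of the grid but must exchange its boundary (halo) values with its neighbors every single step, so no processor can run ahead of the others. Plain Python lists stand in here for what MPI processes would do on a real supercomputer; this is an illustration of the idea, not actual parallel code.

```python
# Toy illustration of why one model on N processors differs from N
# independent models: each "rank" owns a slice of a 1-D temperature
# field, but must see its neighbors' edge values (a halo exchange)
# before every update. Lists stand in for MPI ranks; this is a sketch.

def diffuse_step(chunks):
    """One explicit diffusion step over a field split into chunks."""
    n = len(chunks)
    new_chunks = []
    for i, chunk in enumerate(chunks):
        # halo exchange: grab the neighboring ranks' old edge values
        # (at the global boundaries, mirror our own edge: no-flux walls)
        left = chunks[i - 1][-1] if i > 0 else chunk[0]
        right = chunks[i + 1][0] if i < n - 1 else chunk[-1]
        padded = [left] + chunk + [right]
        # simple explicit diffusion update on this rank's interior
        new_chunks.append([
            padded[j] + 0.25 * (padded[j - 1] - 2 * padded[j] + padded[j + 1])
            for j in range(1, len(padded) - 1)
        ])
    return new_chunks

# a hot spot in the middle of a field split across 4 "processors"
field = [0.0] * 16
field[8] = 100.0
chunks = [field[i:i + 4] for i in range(0, 16, 4)]
for _ in range(50):
    chunks = diffuse_step(chunks)
flat = [v for c in chunks for v in c]
```

If the four chunks were run independently, with no halo exchange, the heat could never spread past a chunk boundary and the answer would simply be wrong; the exchange is what makes it one model, and it is also what forces all the processors to march in lockstep.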
The take-away message from this little exposé is that we have a highly specialized problem, with potentially overwhelming complexity, and a long history of how we have managed to get things done. “Managed to get things done” is at the core. The scientists and the codes are spread all over; they are not, in any formal sense, managed, and we have had to develop management strategies to help control the complexity. There is a tension among management, community, and creativity.
I managed large weather and climate modeling activities when I was at NASA. On a good day, I maintain that I managed them successfully. As a manager I sought control, and I grimaced at some naïve ideas of community. Still, my experience tells me that we need to investigate new ways of developing and using models. This need arises because the complexity is too large to control, especially as we extend the use of climate models to investigating energy policy decisions and, above all, adaptation to climate change.
In the past decade we have seen the emergence of community approaches to complex problem solving. Within these communities we see the convergence of creativity and the emergence of solution paths. We see self-organizing and self-correcting processes evolve. Counterintuitively, perhaps, we see not anarchy, but the emergence of governance in these open communities. The next entry in the series will focus more on describing open communities.
Pakistani Flood Relief Links
Doctors Without Borders
The International Red Cross
MERLIN medical relief charity
U.S. State Department Recommended Charities
The mobile giving service mGive allows one to text the word "SWAT" to 50555. The text will result in a $10 donation to the UN Refugee Agency (UNHCR) Pakistan Flood Relief Effort.
Portlight Disaster Relief at Wunderground.com
An impressive list of organizations