Report reflects shift in climate research toward news you can use
In the effort to better understand the dynamics of the Earth’s changing climate, a recent report from the National Academy of Sciences calls for scientists to collaborate on a “new generation” of highly detailed and integrated climate change models.
According to the NAS release:
With changes in climate and weather . . . past weather data are no longer adequate predictors of future extremes. Advanced modeling capabilities could potentially provide useful predictions and projections of extreme environments.
Those “useful predictions and projections,” according to the report, could come in various forms — supplying farmers with better information about what to plant, year-to-year, say, or giving local officials greater insights into future flood risk, or providing climatologists with better information about specific parts of the country most susceptible to extreme heat.
Currently, a wide range of public and private entities – universities, cities, and state governments – have developed their own climate models. In many cases, according to the NAS committee, this has resulted in a duplication of effort.
To encourage greater sharing and collaboration, the study proposes “a framework in which software, data standards and tools, and even model components are shared by all major modeling groups across the country.” The committee, made up of climate scientists from across the country, has also called for adopting “a common software infrastructure.”
In some ways, the NAS report mirrors some recent efforts here in California, where agencies and researchers have attempted to stitch large, disparate swaths of information into single, comprehensive reports – “news you can use,” as it were. A study released in July by the California Energy Commission, entitled “California’s Changing Climate,” offers up a portfolio of studies on a wide range of topics, from the effects of higher average temperatures on crop yields to the changing distribution and frequency of fires in a warmer climate.
According to Chris Bretherton, an atmospheric science professor at the University of Washington and chair of the NAS committee, this infrastructure can be thought of as a new open-source operating system, designed to tie various regional models together into a larger — even global — framework.
“It’s a software engineering issue,” Bretherton told me. “The idea is to share the same key models and to have them talk to one another.” The more fine-grained and interconnected the climate models become, says Bretherton, the more powerful they will be in terms of guiding policy and decision-making.
But do such systems yet exist? Or are they, like the future climate events they aim to predict, merely hypothetical? Bretherton says there are some successful blueprints, pointing to an effort called the Community Earth System Model, an integrated climate model developed in the 1980s by the National Center for Atmospheric Research.
One California study released earlier in the year, dubbed C-Change.LA, seems to fit the mold of these “next-generation” regional models — the kind that one day could be folded into larger, more comprehensive ones. Touted by its developers as “the most sophisticated regional study ever developed,” C-Change.LA offers a neighborhood-scale assessment of future climate conditions between 2041 and 2060 for the Los Angeles Metro Area.