
Modelling future climate

  • Mark Maslin

Abstract

‘Modelling future climate’ is about understanding the fundamental physical processes of the climate system. It considers the carbon cycle, cooling effects, carbon emissions, and the complex three-dimensional general circulation models that are used both to further our understanding of the global climate system and to predict future global climate. Over 40 climate models were used in developing the IPCC projections for the 2013 report. The three main realistic carbon emissions pathways suggest the global mean surface temperature could rise by between 2.8°C and 5.4°C by 2100, and predict an increase in global mean sea level of between 52 cm and 98 cm in this timeframe.

The whole of human society operates on knowing the future weather. For example, farmers in India know when the monsoon rains will come next year and so they know when to plant their crops. Farmers in Indonesia know there are two monsoon rains each year, so next year they can have two harvests. This is based on their knowledge of the past, as the monsoons have always come at about the same time each year in living memory. But the need to predict goes deeper than this, as it influences every part of our lives. Our houses, roads, railways, airports, offices, cars, trains, and so on are all designed for the local climate. For example, in England all the houses have central heating, as the outside temperature is usually below 20°C, but no air-conditioning, as temperatures rarely exceed 26°C, while in Australia the opposite is true and most houses have air-conditioning but rarely central heating. Predicting future climate is, therefore, essential, as we know that we can no longer rely on records of past weather of an area to tell us what the future will hold. We have to develop new ways of predicting the future, so that we can plan our lives and so that human society can continue to function fully. So we have to model the future.

Models

There is a whole hierarchy of climate models, from relatively simple box models to extremely complex three-dimensional GCMs. Each has a role in examining and furthering our understanding of the global climate system. However, it is the complex three-dimensional general circulation models that are used to predict future global climate. These comprehensive climate models are based on physical laws represented by mathematical equations, which are solved using a three-dimensional grid over the globe. To obtain the most realistic simulations, all the major parts of the climate system must be represented in sub-models, including atmosphere, ocean, land surface (topography), cryosphere, and biosphere, as well as the processes that go on within them and between them. Most global climate models have at least some representation of each of these components. Models that couple together both the ocean and atmosphere components are called atmosphere–ocean general circulation models (AOGCMs).
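To make the idea of a model hierarchy concrete, here is a minimal sketch (not taken from any actual GCM) of the simplest kind of climate model mentioned above: a zero-dimensional energy-balance ‘box model’ that steps a single physical law, conservation of energy, forward in time. The parameter values are rough textbook numbers chosen only for illustration.

```python
# Illustrative only: a zero-dimensional energy-balance 'box model' of the kind
# at the simple end of the model hierarchy described above. All parameter
# values are rough textbook numbers, not taken from any specific GCM.

SOLAR = 1361.0          # incoming solar radiation at the top of the atmosphere (W/m2)
ALBEDO = 0.3            # fraction of sunlight reflected straight back to space
SIGMA = 5.67e-8         # Stefan-Boltzmann constant (W/m2/K^4)
EMISSIVITY = 0.61       # effective emissivity, crudely standing in for the greenhouse effect
HEAT_CAPACITY = 4.2e8   # effective heat capacity of the surface layer (J/m2/K)
DT = 86400.0            # time step: one day, in seconds

def step(temp_k: float) -> float:
    """Advance the surface temperature by one time step using energy conservation."""
    absorbed = SOLAR / 4.0 * (1.0 - ALBEDO)       # sunlight averaged over the sphere
    emitted = EMISSIVITY * SIGMA * temp_k ** 4    # outgoing long-wave radiation
    return temp_k + DT * (absorbed - emitted) / HEAT_CAPACITY

temp = 255.0                      # start from a cold state (kelvin)
for _ in range(365 * 50):         # run forward for 50 model years
    temp = step(temp)
print(f"Equilibrium surface temperature: {temp - 273.15:.1f} C")
```

A full GCM applies the same principle of solving physical equations forward in time, but on a three-dimensional grid with coupled sub-models for the atmosphere, ocean, land surface, cryosphere, and biosphere.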

Over the last 30 years there has been a huge improvement in climate models. This has been due to our increased knowledge of the climate system but also because of the nearly exponential growth in computer power. There has been a massive improvement in the spatial resolution of the models from the very first IPCC report in 1990 to the latest in 2013. The current generation of AOGCMs have multiple layers in the atmosphere, land, and ocean and a spatial resolution greater than one point every 100 km by 100 km (see Figure 12). Equations are typically solved for every simulated ‘half-hour’ of a model run. Many physical processes, such as cloud and ocean convection, of course take place on a much smaller scale than the model can resolve. Therefore, the effects of these small-scale processes have to be lumped together, which is referred to as ‘parameterization’. Many of these parameterizations are, however, checked with separate ‘small-scale-process models’ to validate the scaling up of these smaller influences.

12. Generic structure of a global climate model

The reason that the spatial scale is limited is that comprehensive AOGCMs are very complex and use a huge amount of computer time to run. At the moment, much of the improvement in computer processing power that has occurred over the last decade has been used to improve the representation of the global climate system by coupling more models directly into the GCMs. The very latest models, or ‘climate simulators’ as some groups are now referring to them, include much better representations of atmospheric chemistry, clouds, aerosol processes, and the carbon cycle, including land vegetation feedbacks. But the biggest unknown in the models is not the physics: it is the estimation of future global GHG emissions over the next 90 years. This involves many variables, such as the global economy, global and regional population growth, the development of technology, energy use and intensity, political agreements, and personal lifestyles. Hence you could produce the most complete model in the world, taking two years to simulate the next 100 years, but you would have only one prediction of the future, based on only one estimate of future emissions, which might be completely wrong. Individual models are therefore run many times with different inputs to provide a range of possible futures. In fact, the latest IPCC science report has drawn on the results of multiple runs from over 40 different AOGCMs to provide the basis for its predictions. Of course, as computer processing power continues to increase, both the representation of coupled climate systems and the spatial scale will continue to improve.
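The idea of parameterization mentioned above can also be illustrated with a small sketch. In a real model, sub-grid processes such as cloud formation are represented by bulk formulae based on quantities the grid box does resolve; the functional form and the critical humidity used below are invented for illustration and are not any particular model's scheme.

```python
# Illustrative sketch of 'parameterization': representing a sub-grid process
# (cloud cover) with a bulk formula based on grid-box mean quantities.
# The functional form and the critical humidity are invented for illustration
# and are not any particular model's scheme.

def cloud_fraction(relative_humidity: float, rh_critical: float = 0.7) -> float:
    """Diagnose fractional cloud cover in a grid box from its mean relative humidity.

    Below the critical humidity no cloud forms; above it, cover increases
    smoothly towards 1 at saturation.
    """
    if relative_humidity <= rh_critical:
        return 0.0
    excess = (relative_humidity - rh_critical) / (1.0 - rh_critical)
    return min(1.0, excess ** 2)

# Example: a grid box with a mean relative humidity of 85 per cent
print(cloud_fraction(0.85))   # -> 0.25
```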

Not only does the IPCC fifth assessment include some significant improvements in the representation of the physical processes of the climate system, but many of the models also had a small increase in spatial resolution. The models also focus on decadal forecasts, to help capture the internal variability of the climate and to understand when the rate of warming may slow down or speed up. There are also separate chapters dealing with near-term climate up to 2050 and climate change after 2100.

Carbon cycle

At the heart of the AOGCMs is the carbon cycle and the estimation of what happens to anthropogenic carbon dioxide and methane emissions. About half of all our carbon emissions are absorbed by the natural carbon cycle and do not end up in the atmosphere, but rather in the oceans and the terrestrial biosphere, so the Earth’s carbon cycle is complicated, with both sources and sinks of carbon dioxide. Figure 13 shows the global carbon reservoirs in gigatonnes of carbon (GtC, where one gigatonne is 1,000 million tonnes) and the fluxes, the ins and outs of carbon, in GtC per year. The figures indicate the changes since the industrial revolution. Evidence is accumulating that many of the fluxes can vary significantly from year to year.

13. The carbon cycle, in gigatonnes of carbon (GtC)

This is because, in contrast to the static view conveyed in figures like this one, the carbon system is dynamic and coupled to the climate system on seasonal, inter-annual, and decadal timescales. What has become clear is that the surface ocean and the land biosphere each take up about 25 per cent of our carbon emissions every year. However, as the oceans continue to warm they can hold less dissolved carbon dioxide, which means that their uptake will reduce. Also, as we continue to deforest and substantially alter land use, the land biosphere’s ability to absorb carbon diminishes.
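A rough back-of-envelope calculation makes this partitioning concrete. The annual emission figure below is illustrative, and the conversion of about 2.12 GtC per ppm of atmospheric CO2 is a widely used approximation.

```python
# A back-of-envelope sketch of the partitioning described above: roughly a
# quarter of annual emissions is taken up by the surface ocean, a quarter by
# the land biosphere, and the remainder stays in the atmosphere. The emission
# figure is illustrative; ~2.12 GtC per ppm of CO2 is a widely used conversion.

EMISSIONS_GTC_PER_YEAR = 10.0   # illustrative annual anthropogenic emissions (GtC)
OCEAN_FRACTION = 0.25           # share absorbed by the surface ocean
LAND_FRACTION = 0.25            # share absorbed by the land biosphere
GTC_PER_PPM = 2.12              # gigatonnes of carbon per ppm of atmospheric CO2

ocean_uptake = EMISSIONS_GTC_PER_YEAR * OCEAN_FRACTION
land_uptake = EMISSIONS_GTC_PER_YEAR * LAND_FRACTION
airborne = EMISSIONS_GTC_PER_YEAR - ocean_uptake - land_uptake

print(f"Ocean uptake:  {ocean_uptake:.1f} GtC/yr")
print(f"Land uptake:   {land_uptake:.1f} GtC/yr")
print(f"Stays in air:  {airborne:.1f} GtC/yr "
      f"(~{airborne / GTC_PER_PPM:.1f} ppm CO2 rise per year)")
```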

Warming and cooling effects

As well as the warming effects of the GHGs, the Earth’s climate system is complicated by the fact that there are also cooling effects (see Figure 14). These include particles in the air (called aerosols, many of which come from human pollution such as sulphur emissions from power stations), which have a direct effect on the amount of solar radiation that hits the Earth’s surface.

14. Radiative forcings between 1750 and 2011

Aerosols may have a significant local or regional impact on temperature. In fact, the AOGCMs have now factored them into the computer simulations of climate change, and they provide an explanation of why industrial areas of the planet have not warmed as much as previously predicted. Water vapour is a GHG, but, at the same time, the upper white surface of clouds reflects solar radiation back into space. This reflection is called ‘albedo’, and clouds and ice have a high albedo, which means that they reflect large quantities of solar radiation back into space. Increasing aerosols in the atmosphere also increases the amount of cloud, as aerosols provide points on which the water vapour can nucleate. Predicting what will happen to the amount and types of clouds, and their warming potential, has been one of the key challenges for climate scientists.

Emission models of the future

A critical problem in trying to predict future climate is estimating the amount of carbon dioxide that will be emitted in the future. This will be influenced by population growth, economic growth, development, fossil-fuel usage, the rate at which we switch to alternative energy, the rate of deforestation, and the effectiveness of international agreements to cut emissions. Out of all the systems that we are trying to model into the future, humanity is by far the most complicated and unpredictable. If you want to understand the problem of predicting what will happen in the next 100 years, imagine yourself at the beginning of the 20th century and what you would have predicted the world to be like in the 21st century. At the beginning of the 20th century, the British Empire was the dominant world power due to the industrial revolution and the use of coal. Would you have predicted the switch to a global economy based on oil after the Second World War? Or the explosion of car use? Or the general availability of air travel? Even 20 years ago, it would have been difficult to predict that there would be budget airlines, allowing for cheap flights throughout Europe and the USA.

The original IPCC reports used simplistic assumptions about GHG emissions over the next 100 years. From 2000 onwards the climate models used the more detailed Special Report on Emission Scenarios (SRES). The 2013 IPCC Fifth Assessment report used more sophisticated Representative Concentration Pathways (RCPs), which considered a much wider range of inputs to the socio-economic models, including population, land use, energy intensity, energy use, and regionally differentiated development (see Table 2). However, the new RCPs mean that detailed comparison of the 2013 IPCC results with the IPCC 2001 and 2007 outputs, which used the SRES, is difficult.

There are four main RCPs, defined by the final radiative forcing achieved by the year 2100, and they range from 2.6 to 8.5 watts per square metre (W/m2). Radiative forcing is defined as the difference between the sunlight (radiant energy) received by the Earth and the energy radiated back to space.

Table 2. Defining the Representative Concentration Pathways used in the IPCC 2013 report

RCP8.5: Rising radiative forcing pathway leading to 8.5 W/m2 (~1370 ppm CO2 eq*) by 2100.

RCP6: Stabilization without overshoot pathway to 6 W/m2 (~850 ppm CO2 eq) at stabilization after 2100.

RCP4.5: Stabilization without overshoot pathway to 4.5 W/m2 (~650 ppm CO2 eq) at stabilization after 2100.

RCP2.6 (also called RCP3PD): Peak in radiative forcing at ~3 W/m2 (~490 ppm CO2 eq) before 2100, then decline to 2.6 W/m2 (~420 ppm CO2 eq) by 2100.

* CO2 eq is the carbon dioxide equivalent of all the GHG radiative forcings combined.

Radiative forcing is quantified at the tropopause, the top of the troposphere, which is the lowest layer of the Earth’s atmosphere and where all weather occurs. Its height ranges from 10 km (~6 miles) at the Poles to nearly 18 km (~11 miles) in the Tropics. Radiative forcing is measured in units of W/m2 of the Earth’s surface. A positive forcing (more incoming energy) warms the system, while a negative forcing (more outgoing energy) cools it. The radiative forcing of the Earth can change due to changes in insolation (incident solar radiation) and in the concentrations of GHGs and aerosols. The four RCPs were selected to be representative of the three most likely emission pathways: two medium stabilization scenarios (RCP4.5 and RCP6) and one business-as-usual baseline emission scenario (RCP8.5). An additional RCP was included to illustrate what could be achieved if every mitigation strategy were employed (RCP2.6). This pathway is also referred to as RCP3PD, a name that emphasizes the radiative forcing trajectory: it first rises to a peak forcing level of 3 W/m2 and then declines, the PD representing ‘peak then decline’ (see Figure 15). Emissions would initially increase, producing a radiative forcing of 3 W/m2, and then there would be huge cutbacks in emissions so that by the year 2100 a radiative forcing of only 2.6 W/m2 was achieved. What is rarely mentioned about this RCP is that it assumes negative emissions from 2070 onwards, which means that not only does the world have to cease producing any carbon emissions by 2070 but that after this date we will actively be taking carbon dioxide and other GHGs out of the atmosphere, which is an immense undertaking.
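The CO2-equivalent concentrations in Table 2 can be roughly related to the radiative forcings that name each RCP using a widely used simplified expression for CO2 forcing, ΔF ≈ 5.35 ln(C/C0) W/m2, relative to a pre-industrial concentration of about 278 ppm. This is only a consistency check, not the IPCC’s own calculation.

```python
# Consistency check using the widely used simplified expression for CO2
# radiative forcing: dF = 5.35 * ln(C / C0) W/m2, with C0 the pre-industrial
# concentration (~278 ppm). Applied to the CO2-equivalent values in Table 2
# it roughly reproduces the forcing levels that name each RCP. This is not
# the IPCC's own calculation, just an order-of-magnitude check.
import math

PRE_INDUSTRIAL_PPM = 278.0

def co2_forcing(ppm_co2_eq: float) -> float:
    """Approximate radiative forcing (W/m2) relative to pre-industrial CO2."""
    return 5.35 * math.log(ppm_co2_eq / PRE_INDUSTRIAL_PPM)

for name, ppm in [("RCP8.5", 1370), ("RCP6", 850), ("RCP4.5", 650), ("RCP2.6 peak", 490)]:
    print(f"{name}: ~{co2_forcing(ppm):.1f} W/m2")
# -> roughly 8.5, 6.0, 4.5, and 3.0 W/m2 respectively
```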

Modelling uncertainty

In the most recent IPCC Fifth Assessment report, the RCPs were input into about 40 GCMs. Each of these models has its own independent design and parameterizations of key processes. The independence of each model is important, as confidence can be derived from multiple runs on different models producing similar future climate predictions.

15. Future carbon emission scenarios

In addition, the differences between the models can help us to learn about their individual limitations and advantages. Within the IPCC, due to political expediency, each model and its output is assumed to be equally valid. This is despite the fact that some are known to perform better than others when tested against the reality provided by the historic and palaeoclimate records. Moreover, though we understand uncertainty within a single model, the notion of quantifying uncertainty across many models currently lacks any real theoretical basis. The IPCC combines all the models used for each run and then presents the mean and the uncertainty between the models. This way it is clear that there are differences in the model output, but that in general the models agree, and that they show very different futures depending on which RCP we take. The uncertainties in the IPCC 2013 report are slightly higher than those in the 2007 report, and this is because of our greater understanding of the processes and our ability to quantify that knowledge. So though our confidence in the climate models has increased, so has the range of possible answers for any specific GHG forcing. Dr Dan Rowlands (Oxford University) and colleagues in 2012 explored the amount of uncertainty inherent in complex models by running one specific climate model through nearly 10,000 simulations (as opposed to the handful of runs that can usually be managed). While their average results matched well with the IPCC projections, they found that more extreme results, including warming of up to 4°C by 2050, were just as likely as the less extreme results.
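The way the IPCC combines the models can be illustrated with a minimal sketch: take each model’s projection for a given RCP and report the ensemble mean and the spread between models. The numbers below are invented for illustration and are not actual model output.

```python
# Minimal sketch of how a multi-model ensemble is summarized: take each
# model's projection for a given RCP and report the ensemble mean and the
# spread between models. The numbers are invented for illustration and are
# not output from any actual AOGCM.
import statistics

# Hypothetical 2081-2100 warming (deg C) from a handful of models under one RCP
model_projections = [3.1, 3.6, 3.9, 4.2, 3.4, 4.6, 3.8]

ensemble_mean = statistics.mean(model_projections)
inter_model_spread = statistics.stdev(model_projections)

print(f"Ensemble mean warming: {ensemble_mean:.1f} C")
print(f"Inter-model spread (1 s.d.): +/- {inter_model_spread:.1f} C")
print(f"Full range: {min(model_projections):.1f} to {max(model_projections):.1f} C")
```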

The RCPs and the different GCMs are just the start of what I call the cascade of uncertainty, because these model outputs are then used in much higher resolution models to provide a better understanding of the potential impacts of climate change (see Figure 16). This is called down-scaling and is a huge problem recognized in the modelling community, because precipitation is spatially and temporally highly variable but essential to model if human impacts are to be understood. Ultimately the cascade of uncertainty leads to a huge range of potential futures at a regional level, which are in some cases contradictory. For example, detailed hydrological modelling of the Mekong River Basin using climate model input from just a single GCM from the UK Met Office (HadCM3) led to projected future changes in annual river discharge ranging from a decrease of 5.4 per cent to an increase of 4.5 per cent.

16. Cascading uncertainty through climate change models and policy

Changes in predicted monthly discharge are even more dramatic, ranging from ‒16 per cent to +55 per cent. Advising policy makers becomes extremely hard when the uncertainties do not even allow one to tell whether the river catchment system will have more or less water in the future.

Future global temperatures and sea level

Between 32 and 39 AOGCMs were run for each of the RCPs for the IPCC 2013 report, to produce scenarios of the global temperature and sea level changes that may occur by 2100. This is a significant change from the IPCC 2001 report, in which only seven models were used. These climate models suggest that the global surface temperature between 2016 and 2035 will rise by between 0.3°C and 0.7°C relative to the average of 1986‒2005. The global temperature rise for the average of 2081‒2100, again relative to the average of 1986‒2005, will be heavily dependent on the RCP we follow (see Table 3). If the realistic RCPs are considered, global temperatures could rise by between 1.1°C and 4.8°C in the last two decades of the century (see Figure 17). With the rise of 0.8°C that has already occurred, this would represent a total rise of 1.9°C to 5.6°C. An added confusion is that the IPCC Fourth Assessment report in 2007 reported global temperatures at 2100 instead of an average for 2081‒2100. Using the best estimates for the original six emission scenarios, it reported a range of between 1.8°C and 4°C by 2100.

Table 3. Temperature and sea level projections by Representative Concentration Pathway

Global temperature change (°C) and global sea level rise (m) for 2081‒2100:

RCP8.5: temperature change 2.6 to 4.8 (mean 3.7); sea level rise 0.45 to 0.82 (mean 0.63)

RCP6: temperature change 1.4 to 3.1 (mean 2.2); sea level rise 0.33 to 0.63 (mean 0.48)

RCP4.5: temperature change 1.1 to 2.6 (mean 1.8); sea level rise 0.32 to 0.63 (mean 0.47)

RCP2.6 (or RCP3PD): temperature change 0.3 to 1.7 (mean 1.0); sea level rise 0.26 to 0.55 (mean 0.40)


17. Global temperatures, Arctic sea ice, and sea level in the 21st century

This compares to the IPCC Fifth Assessment Report 2013’s final mean global temperatures for RCP4.5 and RCP8.5 of between 1.9°C and 4.1°C by 2100: a strikingly similar set of results.

In terms of sea-level rise, again this is dependent on which RCP we follow, but taking the three realistic ones it will be between 0.32 m and 0.82 m in the last two decades of the century (Table 3 and Figure 17). With the rise of 20 cm that has already occurred, this would represent a total rise of between 0.52 m and 1.02 m. If we look at the final projected sea level at 2100, the models show an increase in global mean sea level of between 27 cm and 98 cm. This is similar to, but more extreme than, the projection made by the IPCC 2007 report, which suggested a sea level rise of between 28 cm and 79 cm by 2100.

Modelling extreme events

Climate change modelling has advanced so rapidly in the last decade that it can now attempt to attribute the contribution of anthropogenic climate change to extreme weather events. A few years ago this would have been unheard of, and the standard communication line was that scientists could not attribute individual weather events to climate change, only say that the event in question may be consistent with what is expected to happen in the future. However, with increased computer power it is possible to run regional climate scenarios thousands of times with and without the contribution of anthropogenic GHGs and so assess the potential influence on the occurrence of extreme weather events. A discernible contribution of anthropogenic climate change has been found for the UK floods in 2000, the Russian heat wave of 2010, and the Texan and East African droughts of 2011; while no climate change influence has been found for the floods in Thailand in 2011 or in Pakistan in 2010. This science, though, is still in its infancy and sometimes throws up contradictory studies, because we are as yet unable to define exactly what we mean by an anthropogenic climate change contribution. This complexity is shown by two research papers relating to the Russian heat wave of 2010: one concluded that climate change had not contributed to the event, while the other concluded that it had. This apparent mismatch was caused by the papers asking different questions. The first study showed that climate change had had little or no effect on the magnitude of the Russian heat wave, while the second study showed that climate change had increased the frequency at which such events could occur three-fold. This demonstrates the importance of scientists and policy makers asking the right questions.
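The ‘with and without anthropogenic GHGs’ comparison described above is often summarized as the fraction of attributable risk, FAR = 1 − P(natural)/P(anthropogenic). The sketch below uses invented event counts purely to show the arithmetic; they are not results from any of the studies mentioned.

```python
# Sketch of the 'with and without' comparison described above, summarized as
# the fraction of attributable risk: FAR = 1 - P(natural) / P(anthropogenic).
# The event counts are invented for illustration, not results from any study.

def fraction_attributable_risk(exceed_natural: int, runs_natural: int,
                               exceed_anthro: int, runs_anthro: int) -> float:
    """How much of the event's probability is attributable to human forcing."""
    p_natural = exceed_natural / runs_natural   # probability in a 'natural-only' world
    p_anthro = exceed_anthro / runs_anthro      # probability with anthropogenic GHGs included
    return 1.0 - p_natural / p_anthro

# Say a heat-wave threshold is exceeded in 100 of 10,000 natural-only runs,
# but in 300 of 10,000 runs that include anthropogenic forcing:
far = fraction_attributable_risk(100, 10_000, 300, 10_000)
print(f"FAR = {far:.2f}  (the event is made about three times more likely)")
```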

What the sceptics say

One of the best ways to summarize the perceived problems of modelling climate change is to review what the sceptics say.

(1) Clouds can have negative feedbacks on global climate which will reduce the effects of climate change

As has been the case since the very first IPCC report in 1990, the greatest uncertainty in future predictions is the role of clouds and their interaction with radiation. Clouds can absorb and reflect solar radiation, thereby cooling the surface, and can absorb and emit long-wave radiation, thus warming the surface. The competition between these effects depends on a number of factors: the height, thickness, and radiative properties of clouds. The radiative properties and the formation and development of clouds depend on the distribution of atmospheric water vapour, water drops, ice particles, atmospheric aerosols, and cloud thickness. The physical basis of how clouds are represented or parameterized in the AOGCMs has greatly improved through the inclusion of representations of cloud microphysical properties in the cloud water budget equations. Clouds still represent a significant source of uncertainty in climate simulations. However, as Figure 14 shows, even if the most extreme cooling value is applied for clouds, the warming factors are still three times larger.

(2) Different models give different results, so how can we trust any of them?

This is a frequent response from many people not familiar with modelling, as there is a feeling that somehow science must be able to predict an exact future. However, in no other walk of life do we expect this precision. For example, you would never expect to get a perfect prediction of which horse will win a race or which football team will emerge triumphant. The truth is that none of the climate models is exactly right. But what they provide is the best estimate that we have of the future. This view of the future is strengthened by the use of more than one model, because each model has been developed by different groups of scientists around the world, using different assumptions and different computers, and thus each provides its own particular prediction of the future. What gives scientists confidence in the model results is that they all roughly predict the same trend in global temperature and sea level for the next 100 years. One of the great strengths of the 2013 IPCC report is that it used over 40 models, while the 2007 IPCC report used 23 international models, compared with seven in 2001. Another strength of this large-scale multiple model approach is that scientists can also give an estimation of how confident they are in the model results, as well as a range of possible predictions, as discussed earlier. One key test of climate models is the equilibrium climate sensitivity (ECS), whereby the model predicts what the global temperature change would be if pre-industrial carbon dioxide levels were doubled. These results have been very consistent over the last 40 years (see Figure 18) and the 2013 IPCC report suggests the range is between 1.5°C and 4.5°C, which is consistent with other measures. What is even more amazing is that in 2014 the UK climate-sceptical think tank the Global Warming Policy Foundation (GWPF) published its own report looking into ECS. It concluded that ECS was between 1.25°C and 3.0°C, with a best estimate of 1.75°C. Though this is lower than the IPCC estimate, it is still a major breakthrough that an organization such as the GWPF is now recognizing the impacts of carbon emissions on the atmosphere.
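ECS can be related to the forcing from a doubling of CO2, about 3.7 W/m2 using the same simplified expression as before, divided by a net climate feedback parameter. The feedback values in the sketch below are illustrative, chosen simply to bracket the 1.5°C to 4.5°C range quoted above.

```python
# Sketch relating equilibrium climate sensitivity (ECS) to the forcing from a
# doubling of CO2 and a net climate feedback parameter:  ECS = F_2x / lambda.
# F_2x comes from the same simplified formula used earlier (5.35 * ln 2).
# The feedback values are illustrative, chosen to bracket the 1.5-4.5 C range.
import math

F_2X = 5.35 * math.log(2.0)   # forcing from doubled CO2, ~3.7 W/m2

for feedback in (2.5, 1.2, 0.8):   # W/m2 per degree C, from strong to weak net feedback
    ecs = F_2X / feedback
    print(f"lambda = {feedback:.1f} W/m2 per C  ->  ECS ~ {ecs:.1f} C")
# -> roughly 1.5, 3.1, and 4.6 C
```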

(3) Climate models fail to predict weather.

Many people get weather and climate confused. Climate is generally defined as the average weather. The original definition of climate was the average weather over 30 years, but this has been changed because we now know that our climate is changing, with significant changes seen every decade for the last 50 years.

18. Equilibrium climate sensitivity

It is the chaotic nature of weather that can make it unpredictable beyond a few days, as the Earth’s climate system is sensitive to extremely small perturbations in initial conditions. For example, extremely slight changes in air pressure over the USA can influence the direction and duration of a hurricane. There have, though, been amazing advances in weather prediction over the last decade, leading to storm warnings seven days in advance instead of the two days possible in the 1980s. Climate modelling is, however, much easier, as you are dealing with long-term averages. A good comparison is that though it is impossible to predict at what age any particular person will die, we can say with a high degree of confidence that the average life expectancy of a person in a developed country is about 80 years. The modelling of climate is not limited in the same way as the prediction of the weather, because the longer term systematic influences on the atmosphere are not reliant on the initial conditions. So the longer term trends in regional and global climate are not controlled by small-scale influences. Also, as described above, we now have the computer power to go back and test whether extreme weather events were more intense or frequent due to anthropogenic climate change.

(4) Climate models fail to reconstruct or predict natural variability.

The global climate system contains cyclic variations, which occur on a decade or sub-decade timescale. The most famous is El Niño, which is a change in both ocean and atmospheric circulation in the Pacific region occurring every three to seven years that has a major influence on the rest of the global climate. Sceptics argue that climate models have been unable to simulate satisfactorily these events in the past. However, climate models have become better at reconstructing these past variations in El Niño–Southern Oscillation (ENSO), North Atlantic Oscillation (NAO), and related Arctic Oscillation (AO) as there has been an increasing realization that these have a profound impact upon regional climate (see Climate: A Very Short Introduction). Most models are able to depict these natural variations, picking out particularly the 1976 climate shift that occurred in the Pacific Ocean. All the AOGCMs have predicted outcomes for ENSO and NAO for the next 100 years. However, a lot of improvement is required before there will be confidence in the model predictions. It is, though, testament to the realism of the AOGCMs that they can indeed reconstruct and predict future trends in these short-term oscillations.

(5) Climate models cannot reconstruct past climate.

Past climates are an important test for global climate models, and the IPCC Fifth Assessment report has a whole chapter dedicated to palaeoclimatology. The biggest climate shift for which we have many palaeoclimate reconstructions is that of the last ice age, which ended about 10,000 years ago. A comparison between palaeoclimate data and model simulations for the most extreme stage of the ice age, which occurred 18,000 years ago, suggests that the global climate models are rather good. It shows that the AOGCMs used for predicting future climate can do a good job of reconstructing the extreme conditions of an ice age and can get sea level close to 120 m lower and global temperatures 6°C cooler, with atmospheric carbon dioxide one-third lower and atmospheric methane halved. One important observation is that the models are conservative: they systematically underestimated the extremes of the last ice age. This means we can assume that the future climate predictions are also conservative, and thus climate change is very likely to be at the top end of the estimates.

(6) What about galactic cosmic rays (GCRs)?

GCRs are high-energy particles that cause ionization in the atmosphere, and it has been suggested that they could affect cloud formation. GCRs vary inversely with solar variability because of the effect of the solar wind. This is an excellent example of how climate science progresses: by gaining new knowledge, testing it thoroughly, and, if required, adding it into the climate models. However, there seems to be no correspondence between the variations in cosmic rays and global total cloud cover since 1991, or global low-level cloud cover since 1994. Together with the lack of a proven physical mechanism and the plausibility of other causal factors affecting changes in cloud cover, this makes the association between GCR-induced changes in aerosols and cloud formation unlikely. Some colleagues have found that the evidence showed that connections between solar variation and climate were more likely to be mediated by direct variation of insolation rather than by cosmic rays, and concluded that varying solar activity, whether by direct solar irradiance or by varying cosmic ray rates, would have contributed less than 0.07°C of warming since 1956, in other words less than 14 per cent of the observed global warming. Therefore, a review of the recent and historical literature continues to find that the link between cosmic rays and climate is tenuous.

Summary

Modelling future climate change is about understanding the fundamental physical processes of the climate system. Four new emission scenarios were produced for the 2013 IPCC Science report using a much wider set of inputs to the socioeconomic models, including population, land use, energy intensity, energy use, and regionally differentiated development. One of these emissions pathways (RCP2.6) was developed to indicate to policy makers what could be achieved in terms of climate change if all possible mitigation strategies were employed as soon as possible. Over 40 climate models were used in developing the IPCC projections and the quantification of uncertainty, providing a huge ‘weight of evidence’. Using the three main realistic carbon emissions pathways over the next 85 years, the climate models suggest the global mean surface temperature could rise by between 2.8°C and 5.4°C by 2100. However, it must be remembered that global temperatures will not stop changing once we get to the year 2100. Figure 19 shows how temperatures could continue to rise way beyond the levels of this century, depending on the chosen emission pathway.

19. Global surface temperatures (1950‒2300)

Using the three main realistic carbon emissions pathways, the models also predict an increase in global mean sea level of between 52 cm and 98 cm by 2100.