FAQs

Informing Decisions with Time of Emergence Information

Factors Affecting Projected Time of Emergence

Interpreting Results

Use of Products

Methods

Science and Modeling


Informing Decisions with Time of Emergence Information

Climate change projections are already available for many variables. What added value is provided by Time of Emergence analysis?

A key input to deciding on and prioritizing climate change response actions is information about when, where, and which climate conditions are projected to change in ways that matter to management and operations. Although this information could be gleaned from existing climate change scenarios, it has rarely been explicitly characterized: most climate change projections are reported with no indication of when future conditions are likely to differ from the range of conditions experienced in the past. The Time of Emergence analysis is designed to provide this understanding.

 

How can I use information about Time of Emergence?

Results provided here help indicate the priority areas, in both geographic and scientific terms, that may warrant further investigation and/or consideration for climate change response actions. 


Learn more

Time of Emergence analysis identifies, in relative terms, which climate conditions are projected to change first and where. This information can indicate which systems, operational choices, or planning decisions may warrant adjustment in response to a changing climate, where, and when.

The locations or variables of concern to operators and managers identified using this tool could be combined with information on local sensitivities, design standards, or critical thresholds to indicate the potential timeframe over which the effects of climate change may become noticeable. Such knowledge, together with information about the time required to implement response actions, could guide the prioritization of climate change research and responses for different locations across the Pacific Northwest.

For example, after comparing Time of Emergence results for multiple variables in a single location, the user may choose to focus attention on the system or decision affected by the variable with the earliest projected emergence.

For systems or decisions affected by variables with results indicating “no emergence”, one option is to implement adaptive measures that include regular reviews so that new science could be incorporated as it becomes available.

After examining how the projected Time of Emergence for a specific variable differs for different locations, the user may prioritize efforts in the locations where emergence is projected to occur earlier.

The Time of Emergence results may be combined with information about the length of time needed for developing a climate change response, to prioritize the system(s) or decision(s) where more urgent actions may be required.


 

Does this tool provide all the information I need to prepare for climate risks?

No. Although information about the Time of Emergence of decision-relevant climate change could be useful for screening or reference purposes, it does not provide all the answers.

A careful decision-maker seeking to identify when and where preparatory actions are called for will combine information about the following in their assessment:

  • The Time of Emergence of climate change (earlier emergence implies greater urgency).
  • The sensitivity or vulnerability of the system or decision at hand to changes in conditions (higher sensitivity implies greater urgency).
  • The risk tolerance of your operation/system (low risk tolerance implies greater urgency).
  • The amount of time required to effectively implement response actions (longer lead-time implies greater urgency).

This tool provides only information about the Time of Emergence. The user must supply the rest.

 

Are other climate change decision-support tools available?

Yes, this tool is one of many designed to support climate change decision-making; robust decisions should not be based on a single source of information. However, it is the only tool that presents results from a Time of Emergence analysis.

 

Why are some variables (e.g., stream temperature) not included in the analysis?

Selection criteria for the identification of variables included: stakeholder interest, key regional climate vulnerabilities, data availability, and suitability for Time of Emergence analysis.

See Climate Variables

 

Factors Affecting Projected Time of Emergence

Why are there different rates of climate change?

The different rates of climate change presented in this tool arise from our inability to precisely estimate the true climate change signal from climate model projections. This uncertainty is associated with the Time of Emergence calculation and is represented in this tool by a range of values at the 90% confidence level.


Learn more

Confidence intervals:

Although it is possible, in principle, to look at results at very high (or very low) confidence intervals, this is not advisable, as our confidence in the climate projections decreases towards the extremes (known as the tails) of the distribution. One should be cautious when using estimates associated with confidence intervals near 0% and 100%, as these levels of certainty are virtually nonexistent in climate projections, due to the fluid and chaotic nature of the ocean and the atmosphere.

Different rates of climate change:

Using different simulations (with different model parameters) from the same global climate model is likely to yield different projections, and thus different climate change signals. Because this Time of Emergence analysis extracts the signal from only one simulation per global climate model, the error in the computed slope – the standard deviation of the error in the sample slope with respect to the true slope of the linear signal – is accounted for by computing confidence intervals on that slope.

Limited computational power means that models can represent physical processes and their interactions only down to certain spatial and temporal scales. The representation of processes finer than the model grid cells (e.g., the formation of clouds, precipitation) is called parameterization, and its results depend on the technique applied. Experiments based on perturbed physics ensembles (PPE) are often used to explore the effects of different choices of model parameters, known as parameter uncertainties.

The 5th-to-95th percentile confidence interval is computed from the standard error of the least-squares regression (LSR) slope, which is multiplied by the Student’s t-value at the 95% level to obtain the 95% error term. Adding this error term to, and subtracting it from, the LSR slope gives an upper and a lower bound on the signal. In other words, there is 90% confidence that the true signal lies between the upper and lower bounds. This could be interpreted as:

We are 90% confident that for any one global climate model applied, the true Time of Emergence for a particular variable could occur between year X and year Y.

In other words:

The Time of Emergence projected by each global climate model implies a 90% confidence that future conditions are projected to be substantially different from that between 1950 and 1999, within the range of years indicated.
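As a concrete illustration, here is a minimal sketch (using synthetic data and assumed variable names, not the tool’s actual code) of how such slope bounds can be computed from a single model’s time series:

```python
import numpy as np
from scipy import stats

# Synthetic annual series for one GCM/scenario: a linear trend plus noise.
years = np.arange(2006, 2101)
rng = np.random.default_rng(0)
values = 0.03 * (years - years[0]) + rng.normal(0, 0.5, years.size)

def slope_bounds(x, y, conf=0.90):
    """Return (lower, upper) bounds on the least-squares slope at the given confidence."""
    res = stats.linregress(x, y)
    # Two-sided 90% interval -> Student's t at the 95th percentile, n - 2 degrees of freedom.
    t = stats.t.ppf(0.5 + conf / 2, df=len(x) - 2)
    err = t * res.stderr  # the 95% error term on the slope
    return res.slope - err, res.slope + err

slower, faster = slope_bounds(years, values)
```

The lower and upper slopes correspond to the slower and faster rates of climate change, and hence to later and earlier Times of Emergence, respectively.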

Because this analysis has adopted a multi-model approach (in an attempt to capture some of the model structure uncertainty), the median of the estimated slopes across the multi-model ensemble was taken. The interpretation of the three metrics is therefore:

  • Faster: We are 90% confident that the indicated date is the earliest time when half of the global climate models project future conditions to deviate from the 1950–1999 conditions.

  • Moderate: We are 90% confident that the indicated date is the average time when half of the global climate models project future conditions to deviate from the 1950–1999 conditions.

  • Slower: We are 90% confident that the indicated date is the latest time when half of the global climate models project future conditions to deviate from the 1950–1999 conditions.


 

Which emission scenario, management sensitivity, rate of climate change or model agreement should I choose?

All climate projections contain uncertainties. Therefore, when planning a climate change response, instead of attempting to choose one “best” or “most likely” future scenario to plan for, it is advisable to understand the range of potential future outcomes and develop responses that account for this range. One way to do this is to develop measures that are robust across a range of emission scenarios, rates of climate change, and levels of GCM agreement. Your choices will depend on the risk tolerance of your decision or operation.

See Exploring Uncertainties

 

Why does this tool provide more than one Time of Emergence for a given variable/location?

The range of results presented in this tool reflects the influence of some, but not all, of the uncertainties associated with the projected future climatic and hydrologic conditions, namely:

  • Climate modeling uncertainty (known as structural uncertainty) – addressed by using climate projections from multiple global climate models
  • Future emissions uncertainty (known as forcing uncertainty) – addressed by using climate projections based on multiple emission scenarios
  • Downscaling uncertainty – addressed by using datasets based on more than one downscaling approach

 

Are all the modeling uncertainties accounted for in this Time of Emergence analysis?

No. Some uncertainties that cannot currently be explored using this tool include:

  • Parameter uncertainties – due to incomplete or imprecise knowledge of the actual values of the parameters used in global climate and regional hydrologic models to represent important processes (e.g., feedbacks associated with the carbon cycle, the sulfur cycle, and some ocean transport processes). Parameter uncertainty could be systematically assessed by using climate projections from variants of a given model spanning plausible ranges of parameter values, known as a perturbed physics ensemble (PPE). However, this is currently not feasible due to the computational requirements, so only one variant of each global climate model is used here.
  • Model uncertainty associated with (1) dynamical downscaling (currently simulated by WRF only) and (2) hydrologic simulations (currently simulated by VIC only).
  • Natural climate variability, i.e., the inherent variations in climate on all temporal and spatial scales beyond that of an individual weather event, due to natural internal processes within the climate system as well as influences from volcanic eruptions and solar activity. This is difficult to estimate because the short instrumental record, and the difficulty of reconstructing past climate conditions from paleoclimate proxies, leave an incomplete description of variability on decadal and longer time scales.

 

Interpreting Results

How should I interpret the maps?

Maps provided here are intended to be visual aids for demonstrating the spatial variations in the results. They should not, however, be interpreted as a precise forecast of future conditions due to the underlying assumptions and uncertainties associated with the methodology applied.

As each map represents only one single plausible future based on the selected input parameters, looking at results from a single map could be misleading. Instead, multiple maps should be examined and compared, e.g., by comparing how the maps differ under different emissions scenarios.

Users should note that for the dynamically downscaled WRF results, only one global climate model has been used to generate the Time of Emergence projections.

 

Are the GCMs used weighted when generating ensemble results?

GCMs are not weighted, and are considered equally credible and plausible (note: but not equally likely).


Learn more

Some suggest that “better” GCMs (based on how well they reproduce past climate) should be given more weight in the statistical analysis of results from a group (ensemble) of GCMs. This has not been done here. All GCMs are weighted equally because: (1) a model that does well at simulating the past is not guaranteed to perform equally well at simulating future conditions, especially when one would need to wait a long time for sufficient observations to evaluate future simulations; and (2) whether a GCM does “well” at simulating past climate depends on which aspect of past climate its performance is evaluated on. The same model can rank high or low depending on the specific metric or criteria used for evaluation.

Furthermore, even for analyses that rank and weight models, there is no universally agreed metric for separating “good” from “bad” models. 


 

Why are the results from a single projection (e.g., a single climate model run with a specific emission scenario) not available?

Results from a single projection show only a single trajectory of how future climate might unfold, which overlooks the full range of possible outcomes and could therefore be misleading. Users should recognize that there is no way to scientifically determine the single most accurate projection of future conditions; using different emission scenarios, models etc. will produce different projections of future changes. Because the “best” projection depends on user-specific context and intended use, we provide a range of projections.

Guidance:

If you are trying to illustrate possible future outcomes, we would recommend you use multiple projections to show the range of results, e.g., maps of the faster and slower rate of climate change as well as the central tendency for a particular emissions scenario, or across multiple emissions scenarios, levels of GCM agreement etc.

If a single map (e.g., a single emission scenario) is used, it should be clearly stated which has been used and why, and recognized that this choice will constrain the range of future climate outcomes that are considered.

 

Why is the multi-model average value represented by the median rather than the mean?

Compared to the mean, the median is considered to be a more robust indicator of the central tendency of the projected future climate as it is less susceptible to outlier values (the exceptionally large or small values in the data).
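A toy illustration of this robustness (with invented emergence years):

```python
import numpy as np

# Hypothetical ensemble of projected emergence years, with one outlier model.
toe = np.array([2035, 2038, 2040, 2041, 2095])

print(np.mean(toe))    # 2049.8 -- pulled toward the outlier
print(np.median(toe))  # 2040.0 -- unaffected by the single extreme value
```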

 

Why are the most extreme values of the ensemble not provided in the results?

Results at the 25th and 75th percentiles are considered to be a more robust measure of the projection range than the maximum and minimum data points as this range captures 50% of the global climate model results.

Results could also be presented at other percentiles, e.g., the 20th and 80th percentiles. However, very high or very low projections (known as the tails of the distribution, e.g., the 5th and 95th percentiles) may not be considered robust in decision-making as not all potential futures are modeled.  This means that the maximum/minimum data points do not represent the best/worst case scenario in reality.

 

What is “GCM agreement”?

“GCM agreement” indicates the percentage of the global climate models (GCMs) examined that agree on a particular finding – e.g., 80% GCM agreement denotes that 17 of the 21 GCMs examined indicate a particular pattern of future change (for hydrologic results, where only 6 GCMs were available, 80% agreement means 5 of the 6 GCMs indicate the same pattern). However, GCM agreement gives no information on the likelihood, or probability, of something occurring in the future.
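For illustration, a minimal sketch (with hypothetical emergence years, not the tool’s actual data) of how such an agreement percentage can be counted:

```python
# Hypothetical Times of Emergence for six GCMs; None means "no emergence".
emergence_years = [2042, 2055, 2060, 2060, 2071, None]

def gcm_agreement(by_year, toes):
    """Percentage of models projecting emergence at or before the given year."""
    emerged = [t for t in toes if t is not None and t <= by_year]
    return 100 * len(emerged) / len(toes)

print(gcm_agreement(2060, emergence_years))  # ~66.7: 4 of the 6 GCMs agree
```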


Learn more

“GCM agreement” gives users an indication of the strength of evidence based on the GCMs applied in this analysis.

Note: Users should not be over-optimistic about consensus estimates – while random errors may cancel (on the assumption that GCM projections are equally credible and independent – which may not be true), systematic errors associated with limited knowledge and misrepresented/unresolved processes will not improve with more models of similar quality. Therefore, “GCM agreement” does not necessarily imply the relative likelihood of future climate outcomes in the context of risk-based decision-making.

Also, “GCM agreement” is conditional upon the emission scenarios and GCMs applied in any particular analysis – i.e., analysis using different emission scenarios and/or GCMs not included here may yield different results.


 

Does the “percentage of GCM agreement” translate to the “probability” of something occurring in future?

No. GCM agreement provided here gives no information on the probability of a projection actually occurring in the future. It merely states the level of consensus in the projected direction of change based on the GCMs applied in this analysis.

For example: A projected Time of Emergence of 2060, with 75% model agreement, indicates that 75% of the global climate models examined project future conditions to deviate from 1950–1999 conditions by 2060, while the remaining 25% of the model simulations indicate future conditions will deviate sometime after that year.

A 75% GCM agreement does not indicate that 75% of the underlying global climate models project future conditions to deviate from those experienced in 1950–1999 exactly in 2060.

 

Why is the “probability” of something occurring in future not estimated?

“Probability” generally indicates the expected frequency of occurrence of some outcome over a large number of independent trials carried out under the same conditions; e.g., when rolling a die, there is a 1 in 6 chance (a probability of about 17%) of getting any specific number, say a four. Since there will be only one realization of future climate, a probability value cannot be derived through this experimental approach.

Estimating probabilities is problematic also because the projections explored in this Time of Emergence analysis do not span the entire range of possible futures.

 

Why are the results simply presented as “GCM agreement”, and why have no more sophisticated statistical methods been applied?

“GCM agreement” offers transparency because this model-count approach honestly describes the current science. Unlike other statistical methodologies, such as a Bayesian approach, “GCM agreement” does not involve extrapolating the “evidence” (observations and/or GCM projections) beyond the available data points – an additional source of uncertainty – and thus avoids over-interpretation of the results.


Learn more

Some studies have used a more sophisticated Bayesian statistical framework to derive a Bayesian probability. This probability reflects the degree to which a particular level of future climate change is consistent with the evidence, i.e., the observations and multiple GCM simulations (all with their associated uncertainties) used in the analysis.

This Bayesian methodology involves obtaining the prior distribution from a large ensemble of model simulations. This prior is adjusted using weightings based on how well the simulated historic climates match the observations, which is then used to estimate the final (posterior) probabilistic distribution.
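As a rough illustration only (this tool does not use this method, and every number below is invented), the skill-based weighting step of such a framework might look like:

```python
import numpy as np

obs_trend = 0.20                                   # hypothetical observed historical trend
hist_trends = np.array([0.15, 0.22, 0.30, 0.19])   # each model's simulated historical trend
future_change = np.array([2.1, 2.8, 3.6, 2.4])     # each model's projected future change

# Likelihood-style weights: the closer a model's simulated history is to the
# observations, the more it contributes to the posterior estimate.
w = np.exp(-0.5 * ((hist_trends - obs_trend) / 0.05) ** 2)
w /= w.sum()

posterior_mean = np.sum(w * future_change)         # skill-weighted estimate
```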

Correct Interpretation of this Bayesian methodology: There is 10% certainty (based on the data, current understanding, and the chosen methodology) that the temperature rise will be less than 3.8°F (2.1°C).

Incorrect Interpretation of this methodology: The temperature rise will be less than 2.1°C in 10% of future climates. (Note: there will be only one future climate.)

Why is the Bayesian methodology not applied here?

The Bayesian approach assumes that more weight can be given to scenarios that are more consistent with the evidence. This assumption is less of an issue in weather forecasting, where model skill can be tested and validated over short timeframes. However, long-term climate projections cannot be handled this way, for several reasons:

  1. Calibration of the prior is almost impossible due to the lack of out-of-sample observations because of the time scales of interest and the lifetime of our models – one would have to wait for decades to test how far the simulated climates deviate from reality.
  2. Weighting by model performance would be subjective, given that how performance should be assessed remains an open question (e.g., rankings may vary with the metrics used for evaluation).
  3. The probabilistic projections derived from the Bayesian approach are inevitably uncertain, as they are conditional upon the information used and how the methodology is formulated. There would also be the problem of extrapolation – even if the adjusted prior was somehow known to be reliable for the twentieth century, there would be no demonstrable reliability for the twenty-first century.

 

Given the uncertainties in the underlying climate projections, are projections of the Time of Emergence of climate change still useful for planning purposes?

On their own, the Time of Emergence results available in this tool are not a planning tool; combined with other relevant information, however, they can be a useful guide for making informed decisions.

Each projected Time of Emergence result is of value because it presents a “what if” scenario; hence we provide a range of results rather than single “best estimate” values. This range, together with the ability to assess uncertainty, allows users to select the level of confidence most appropriate for their particular analytical or decision context. The current ensembles represent a lower bound on the maximum range of uncertainty. This also emphasizes the need for a robust and flexible approach to dealing with a range of possible future climates.

 

Why is the exact year and location (at the grid cell scale) of the Time of Emergence not provided?

Accurate forecasts of future conditions are almost impossible, due to incomplete knowledge of how the earth-atmosphere system will respond to human-induced emissions of greenhouse gases and uncertainty over the exact amount and timing of future emissions. Results presented in this tool are intended to illustrate (not predict) which places (e.g., western vs. eastern Washington) could experience noticeable changes before or after others, and for which climate conditions (e.g., temperature extremes vs. precipitation extremes) operators and managers may need to carry out further investigation or prioritize planning, response actions, and/or resource allocation.

Because this analysis is not designed to give information on the exact location or timing of the emergence of noticeable change, results are aggregated to sub-regional (e.g., county or watershed) averages and to periods of the 21st century, presented as “the estimated time by which future conditions are projected to deviate from those in 1950–1999”.

 

What are some of the main assumptions associated with the Time of Emergence projections presented in this tool?

Assumptions are inevitable in all climate modeling approaches due to incomplete knowledge of the earth-atmosphere system and its response to human-induced emissions of greenhouse gases. These assumptions contribute to the uncertainties associated with the Time of Emergence results; i.e., different sets of assumptions in global climate modeling are likely to lead to different Time of Emergence results. It is therefore important to recognize and acknowledge the limitations of the methodology adopted here, to avoid over-confidence in, or over-interpretation of, the results. Using another methodology could produce results that differ from those presented in this tool.

Some of the assumptions associated with this Time of Emergence analysis include:

  • Known sources of uncertainty not examined in the current modeling and statistical framework (e.g., parameter uncertainty) are assumed to be relatively less important, or remain too poorly understood to be included in a credible way.
  • Structural error (the difference between the real world and model projections) is assumed to be represented by the spread across the GCMs applied in the analysis. However, systematic biases common to all GCMs mean that the discrepancy between the real and modeled worlds may not be fully accounted for; the true Time of Emergence could therefore be earlier or later than the range of results presented here indicates.
  • Results from all the GCMs used are assumed to be equally credible. This is largely due to the practical and conceptual challenges of quantifying the quality of single runs of different GCMs, and remains common practice among users of GCM results.

 

Use of Products

How do I acknowledge use of the results, data and products available from this tool?

You are welcome to use images and data from this website as long as they are properly referenced. If they are from one of the science reports then please reference the specific publication.

Reproduction of images and data from the tool should be acknowledged with the following text:

Snover, A.K., E.P. Salathé, C. Lynch, and R.M.S. Yu. 2015. Time of Emergence of Climate Change Signals in the Puget Sound Basin. Climate Impacts Group, University of Washington, Seattle, WA.

 

Methods

What is the baseline period for defining the management sensitivity (“noise”) component in this analysis?

This analysis uses a baseline period of 1950–1999. So, Time of Emergence indicates when climate change is projected to cause climate conditions to noticeably deviate from the 1950–1999 conditions.

Example 1: Emergence of the climate change signal for “Temperature, July” is projected to occur by 2020.

Interpretation:  By 2020, the average temperature for the month of July is projected to be substantially different from that between 1950 and 1999, as a result of climate change; based on specific assumptions about future emissions, management sensitivity to change, projections from a particular suite of global climate models, and a specific downscaling method.

Example 2: Emergence of the climate change signal for “Number of days with 24-hour precipitation exceeding 2 inches, annual” is projected to occur by 2065.

Interpretation:  By 2065, the number of times per year that total 24-hour precipitation exceeds two inches is projected to be substantially different from the frequency during 1950-1999, due to climate change; based on specific assumptions about future emissions, management sensitivity to change, projections from a particular suite of global climate models, and a specific downscaling method.


Learn more

Disruptions may occur when future climate conditions move beyond the coping range of various human and natural systems, and, from the management perspective, this is when actions may be needed. When evaluating the Time of Emergence of management-relevant changes in conditions due to climate change, therefore, it is important to understand the current coping range of your system. This could be defined as the range of historical conditions your system was designed to accommodate or could be characterized based on the types of conditions shown to trigger negative impacts.

The specific baseline period used in this analysis was chosen based on the assumption that management related to operations and planning of large infrastructure tends to be “generally” adapted to past climate fluctuations over a relatively long period. The 1950–1999 period is considered to be reasonably long enough to sample a range of multi-year events (e.g., a few phases of Pacific Decadal Oscillation, PDO) related to natural climate variability, as well as some effects of human-induced climate change. Use of a different baseline period for analysis could result in different Time of Emergence results.


 

How is the climate change signal defined in this analysis?

In this analysis, the climate change signal for a given variable, downscaled global climate model, and emissions scenario is derived by adding the linear best fit (annual trend) of simulated conditions for 2006–2100 to the simulated climatological average for the period 1980–2010.

Year 1 = Climatological average

Year 2 = Climatological average + Annual Trend

Year 3 = Year 2 + Annual Trend

Year 4 = Year 3 + Annual Trend

etc.

The method for calculating the signal is based on Hawkins and Sutton (2012).

Hawkins, E., and R. Sutton, 2012: Time of emergence of climate signals, Geophys. Res. Lett., 39, L01702.
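A minimal sketch of this construction (with synthetic data and an assumed climatological value, not the tool’s actual code):

```python
import numpy as np

# Synthetic simulated annual values for 2006-2100: a trend plus year-to-year noise.
proj_years = np.arange(2006, 2101)
rng = np.random.default_rng(1)
sim = 10 + 0.03 * (proj_years - 2006) + rng.normal(0, 0.4, proj_years.size)

slope = np.polyfit(proj_years, sim, 1)[0]  # annual trend from the linear best fit
climatology = 10.2                         # assumed 1980-2010 climatological average

# Year 1 = climatology; each subsequent year adds one annual-trend increment.
signal = climatology + slope * np.arange(proj_years.size)
```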

 

Why was the signal threshold method chosen instead of other approaches?

The signal threshold method is used here because:

  • The method is robust for a wide variety of variables.
  • The method is more readily applicable to analyses of management-relevant variables – the emergence threshold can easily be adapted to different contexts. For some variables, thresholds are already established; e.g., levels of ground-level ozone are related to the variable “Number of days where daily maximum temperature (Tmax) is greater than 30°C”. (A simple sketch follows.)
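A minimal sketch of the threshold idea (assumed names, and an illustrative “crosses and stays above” convention; not the tool’s actual code):

```python
def time_of_emergence(years, signal, threshold):
    """First year the signal exceeds the threshold and remains above it thereafter;
    None represents "no emergence" within the projection period."""
    for i, yr in enumerate(years):
        if all(v > threshold for v in signal[i:]):
            return yr
    return None

# Example: a synthetic rising signal crossing a management-relevant threshold.
years = list(range(2006, 2101))
signal = [5 + 0.1 * (y - 2006) for y in years]
print(time_of_emergence(years, signal, 10.0))  # -> 2057
```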

 

Science and Modeling

What is a climate projection, or a climate simulation?

A climate projection (or simulation) is the simulated response of the climate system to changing atmospheric levels of greenhouse gases and aerosols. In this tool, projections are based on a collection (known as an ensemble) of equally weighted GCM runs.

Any climate projection for a particular location and future time period results from the specific global climate model used and assumptions made about future emissions.  

 

What is the difference between a “climate projection“ and a “climate prediction”?

A climate prediction (or forecast) is an estimate of the most probable outcome of future climatic developments.

Climate projections are scenarios that describe alternative possible, plausible, internally consistent, but not necessarily equally probable futures. Projections are therefore, technically, conditional predictions, i.e., they result from and are tied to a specific set of assumptions about the future evolution of greenhouse gas emissions, etc.


Learn more

According to IPCC AR5 (2013), “a climate prediction or climate forecast is the result of an attempt to produce (starting from a particular state of the climate system) an estimate of the actual evolution of the climate in the future, for example, at seasonal, interannual or decadal time scales. Because the future evolution of the climate system may be highly sensitive to initial conditions, such predictions are usually probabilistic in nature.”

A climate projection differs from a climate prediction as it depends on the emission/concentration/radiative forcing scenario used, i.e., projections are conditional on assumptions about future socioeconomic and technological developments etc.

Predictability of the climate system is limited because:

  • Knowledge of the past and current states of the climate system is imperfect, as is the representation of those states in GCMs.

  • The climate system is inherently non-linear and chaotic.


 

What is an emission scenario?

An emission scenario describes plausible future atmospheric amounts of greenhouse gases and other substances (e.g., aerosols and aerosol precursors) that influence global climate. This information is necessary for projecting future climate. Because it is impossible to accurately predict the conditions that will determine future greenhouse gas amounts in the atmosphere – future population and land-use changes, patterns of socio-economic and technological development, energy policy, etc. – scientists use multiple emission scenarios to indicate alternative plausible pathways along which the future might unfold.

Scenarios are not intended to predict the future. Actual future greenhouse gas amounts could be higher or lower than those suggested by the emission scenarios applied here.

 

What is a climate model?

A climate model is a complex computer program used to simulate the interactions of the atmosphere, oceans, land surface, and ice. Scientists also use it to conduct experiments that are not possible in the real world, e.g., to project future climate resulting from increases in atmospheric concentrations of greenhouse gases.

The extent to which a climate model can simulate the response of the climate system correlates with the level of understanding of the physical, geophysical, chemical and biological processes that govern the climate system, as well as the ability to represent these processes in the model. 


Learn more

Climate models are simplified mathematical representations of the climate system, comprising differential equations based on the basic laws of physics, fluid motion, and chemistry, as well as the biological properties of climate system components and the interactions and feedbacks between them. These equations form the basis of the complex computer programs commonly used to simulate the Earth's atmosphere and oceans.

Computational limitations mean that it is not possible to model every point in space and time, so a climate model divides the globe into a grid. Climate models generate simulations at either the coarser global scale or the finer regional scale: the former uses a global climate model, the latter a regional climate model.

 

Global Climate Models (GCMs)

Global climate models (GCMs), also known as General Circulation Models or Coupled Atmosphere-Ocean General Circulation Models (AOGCMs), are comprehensive three-dimensional models that solve differential equations for fluid motion and global energy transfer, integrated forward in time. They simulate both atmospheric and oceanic processes and the interactions between them, as the ocean has a significant influence on atmospheric processes. Computational limitations mean that the majority of sub-grid-scale processes are parameterized.

 

Regional Climate Models (RCMs)

Computational constraints prohibit the increase in GCM resolution to the level necessary to include all details of local topography and other factors that determine local climate. To provide information more relevant for decision-making, regional climate models (RCMs) have been developed for simulating climate at a higher resolution for a specific geographic area. RCMs are more capable of simulating climate processes and feedbacks operating at the regional scale due to better representation of local geography.

 

Note: The increased detail of regional climate models does not necessarily result in projections of greater confidence or certainty. Regional simulations inherit the uncertainties from the parent global climate model(s), since it is the output of those global models that are used as input to a regional climate model. The process of regional climate modeling (i.e., “dynamical downscaling”) adds uncertainty to the projections, since, despite their finer resolution, regional climate models do not represent all the important physical processes shaping local climate.


 

What is downscaling?

Global climate models (GCMs) simulate future climate for the globe at a resolution (1-3 degrees) too coarse to account for differences in local conditions or to provide useful information for regional- or local-scale decision-making. Downscaling is the process of increasing the spatial and/or temporal resolution of climate projections to scales more relevant to regional or local decision making.

 

Note: The increased detail of downscaled climate projections should not be interpreted as necessarily providing greater confidence or certainty. Downscaled projections inherit the uncertainties from the parent global climate model(s), since it is the output of those global models that are used as input for the process of downscaling. Downscaling adds uncertainty to the projections.


Learn more

 

Dynamical Downscaling

This involves nesting a finer-scale regional climate model within a coarser-resolution global climate model. The results inherit the uncertainties of the parent global climate model(s), as they are driven by the GCM's boundary conditions. Although errors in the GCMs are retained and potentially amplified by the downscaling process, RCMs contain more topographical detail, which may yield more realistic projections.

 

Statistical Downscaling

This involves applying statistical (i.e., empirical) relationships identified in the observations between larger and smaller-scale conditions to GCM output in a targeted area. Statistical downscaling assumes that the observed relationships will remain valid in a future warmer climate, i.e., the stationarity assumption. 
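A minimal sketch of such an empirical transfer function (synthetic data; the statistical methods behind the datasets used in this tool are more sophisticated):

```python
import numpy as np

rng = np.random.default_rng(2)

# Historical period: coarse GCM grid-cell values and co-located local observations.
coarse_hist = rng.normal(12.0, 3.0, 200)
local_obs = 0.8 * coarse_hist + 2.0 + rng.normal(0, 0.5, 200)

# Fit the empirical large-scale -> local relationship over the historical period.
a, b = np.polyfit(coarse_hist, local_obs, 1)

# Apply it to future GCM output -- valid only if the relationship is stationary.
coarse_future = coarse_hist + 2.0   # stand-in for warmer future GCM output
local_future = a * coarse_future + b
```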


 

What is a multi-model ensemble?

A multi-model ensemble is a group of climate projections created by different global climate models.

Different climate modeling groups around the world represent climate processes in different ways in their models, owing to an incomplete understanding of the response of the climate system to changes in atmospheric concentrations of greenhouse gases. This leads to differences in climate projections; multiple climate models are often used to address this source of climate model uncertainty, known as structural error.

Depending on data availability, the multi-model ensemble projections provided in this tool are based on statistical analysis of output from up to 21 global climate models that have taken part in international model comparisons (CMIP3 and/or CMIP5).

Note:

  1. The frequency distributions across the ensemble of models are unlikely to correspond to the probability of real-world behavior, despite their value in model development. A multi-model ensemble approach may capture some of the structural uncertainty and suggest improved skill and reliability, but the GCMs may not be truly independent, so adding models does not necessarily reduce the uncertainty. For simplicity, all GCMs are assumed to be independent and equally credible.
  2. A multi-model ensemble approach (used here) attempts to address the structural errors between different GCMs while a perturbed physics ensemble (not available here) emphasizes parameter errors within a single model configuration.

 

What is uncertainty in climate projections?

Uncertainty refers to a state of having limited knowledge, due to a lack of information or disagreement over what is known or even knowable. Uncertainty can be represented by quantitative measures or by qualitative statements. This tool enables users to explore only some of the uncertainties (see Exploring Uncertainties).

Uncertainty in climate projections arises from three main sources: (1) natural climate variability, (2) model uncertainty, and (3) forcing uncertainty. Because these sources of uncertainty cannot be eliminated, it is impossible to precisely predict future climate. Instead, scientists use a variety of approaches to indicate the plausible range of future conditions, as described below.

  • Natural climate variability – Associated with: the natural fluctuations that arise in the absence of any human-caused forcing of the planet, including both external influences on the climate (e.g., solar activity and volcanic eruptions) and internal chaotic climate processes. Partially addressed by using: multiple multi-century simulations of unforced climate from a single global climate model, with identical external forcing but each simulation starting from slightly different initial conditions.
  • Climate model uncertainty – Associated with: an incomplete understanding of Earth system processes and their imperfect representation in global climate models. Partially addressed by using: multiple global climate models to indicate the combined effects of natural climate variability and human-caused climate change.
  • Forcing uncertainty – Associated with: things in the future that are considered outside the climate system per se yet affect it, e.g., the unknown amount and timing of future global greenhouse gas emissions. Partially addressed by using: multiple emission scenarios to indicate a plausible range of emissions and atmospheric concentrations of greenhouse gases.

 

This Time of Emergence tool was designed to emphasize that climate change projections are most appropriately understood as a range of plausible future conditions, and that best practices for climate change preparation and adaptation involve identifying and considering this plausible range.

 

What is modeling uncertainty?

Modeling uncertainty refers to an incomplete understanding of Earth system processes and their imperfect representation in climate or climate-impact computer models that are based on a mixture of theory, observations, experimentation and expert judgment.

Modeling uncertainty leads to different projections of future changes for the same amount and timing of greenhouse gas emissions, and consists of two components:

  1. Structural error: Different modeling groups represent physical processes in different ways in their models because processes to be included and their parameterization may be subjectively chosen based on expert knowledge and experience.
  2. Parameter error: Not every physical process in the Earth system can be accurately described in a model, often because these processes operate at scales too fine to be explicitly incorporated into a model. Consequently, these processes are represented by parameters, the values of which may be subjectively assigned.

 



The Time of Emergence project was conceived and funded by U.S. Army Corps of Engineers Climate Preparedness & Resilience programs & U.S. Environmental Protection Agency-Region 10. Methodologies and stakeholder engagement were developed and implemented by the University of Washington's Climate Impacts Group. The Time of Emergence online tool was developed with support from the Center for Data Science, University of Washington-Tacoma.