4.1.5 Obtaining actionable climate model output - higher resolution climate information

Regional Climate Models

Regional Climate Models (RCMs) were originally developed to bridge the gap caused by a lack of computing power, which meant that GCMs could be run only at relatively coarse spatial resolutions (200-300 km). While adequate for examining climate change at global and continental scales, GCM output was generally not sufficient for studies at more regional scales. This led to the development of RCMs, which at that time were much higher resolution climate models (typically between 10 and 50 km) operating over a limited domain, e.g., North America or Europe. Using RCMs to obtain finer-scale climate information is known as dynamical downscaling. Today, however, some GCMs/ESMs are run at even finer spatial resolutions than most RCMs, e.g., MIROC4h and CMCC-CM (Tapiador et al., 2020). The number of such models remains limited, however, which precludes the construction of a high-resolution multi-model ensemble.

Many GCMs, however, are still run at coarser spatial resolutions, and RCMs provide a means of dynamically downscaling their output to finer spatial scales. RCMs incorporate many of the same physical processes and parameterisations as GCMs, and in fact often share the same code. The increased spatial resolution means that some climate processes that are parameterised in a GCM may be explicitly represented in the RCM. Finer spatial scales also mean that the underlying topography and land/water boundaries can be more accurately represented in an RCM.

Since RCMs operate over limited spatial domains, they require information (e.g., pressure, temperature, wind and moisture) from a GCM at their lateral boundaries to drive the climate within the RCM. In this way, the RCM provides a physically based simulation of the climate within its boundaries which is consistent with that of its driving GCM, but at higher spatial resolution. It must be remembered, though, that the RCM inherits any errors and biases present in the driving GCM. Some of the more recent very high resolution RCMs (model resolution ≤ 4 km) are able to explicitly represent physical processes such as atmospheric convection (convection-permitting models), which can lead to much improved simulations of short-duration precipitation extremes (e.g., Kendon et al., 2017). The computing costs associated with these very high resolution models remain very high, however, and currently limit their widespread use.

Regional climate modelling is overseen by CORDEX (Coordinated Regional Climate Downscaling Experiment), a framework implemented by the World Climate Research Programme to advance and coordinate the science and application of regional climate downscaling through global partnerships. As is the case with CMIP, this program oversees experiment design and the collection and dissemination of RCM results for a number of continental-scale domains, including North America.

CMIP5 CORDEX experiments focussed on nine core model domains, including North America (NA-CORDEX), at horizontal spatial resolutions of about 45 km and about 25 km. Historical simulations used the same atmospheric forcing as the CMIP5 GCMs/ESMs, and the future simulations focussed on RCP4.5 and RCP8.5. The Canadian RCMs involved in NA-CORDEX are CRCM5-OUR, CRCM5-UQAM and CanRCM4, with spatial resolutions between 0.11° and 0.44°.

For CMIP6, the CORDEX experimental protocol (Gutowski et al., 2016) recommends that RCM groups use the same historical atmospheric forcing as the CMIP6 GCMs/ESMs, but for the future simulations, starting in 2015, it recommends focussing on SSP1-2.6 and SSP3-7.0. After these experiments are complete, and if sufficient resources are available, dynamical downscaling of SSP2-4.5 and/or SSP5-8.5 is also recommended. The target horizontal spatial resolutions recommended for these experiments are 12.5 km and 25 km across 14 core domains, including North America and the Arctic.

Both CORDEX experimental protocols include an evaluation experiment in which re-analysis data are used to drive the RCMs: ERA-Interim for the 1979-2017 period for the CMIP5 RCMs, and ERA5 for at least the 1979-2020 period for the CMIP6 RCMs.

Bias Adjustment and Downscaling

All climate models (GCMs, ESMs and RCMs) have one thing in common regardless of resolution: they exhibit some degree of bias, meaning that they do not simulate the present-day climate perfectly. While the bias differs from model to model, some biases are common across models. Bias correction, or adjustment, is particularly important when considering climate change impacts that depend on the crossing of absolute thresholds.
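To illustrate why threshold-based impacts are sensitive to bias, the small synthetic calculation below (the numbers are purely illustrative and not drawn from any particular model or station) shows how a modest warm bias inflates the apparent count of days exceeding an absolute threshold:

```python
import numpy as np

# Illustrative daily summer maximum temperatures (degC) at one location.
rng = np.random.default_rng(2)
obs = rng.normal(24.0, 4.0, 9200)        # ~100 summers (92 days each) of "observed" days
sim = obs + 2.0                          # the same days as seen by a model with a +2 degC warm bias

threshold = 30.0                         # an absolute impact threshold, e.g., for heat stress
print((obs > threshold).mean() * 92)     # observed hot days per summer  (~6)
print((sim > threshold).mean() * 92)     # simulated hot days per summer (~15)
```

Without bias adjustment, the raw model output in this example would more than double the apparent frequency of threshold exceedances, even though the bias in the mean is only 2 °C.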

Different methodologies have been developed to deal with this bias, ranging from relatively simple methods that can be applied to all variables to very complex methods that, in most cases, operate only on the most commonly used meteorological variables. Many of the bias adjustment methodologies for meteorological variables also include a downscaling step, which produces a bias-adjusted dataset at higher spatial resolution than the original GCM/ESM. One of the simplest approaches to minimize the effect of bias in the simulated present-day climate is to use the multi-model mean, generally known as the ensemble mean, together with the 25th and 75th percentile values. Ensemble-mean values have been shown to compare best with observations (e.g., Flato et al., 2013). The interval defined by these percentiles of the ensemble for a given emissions scenario, often referred to as the uncertainty range, gives an indication of the range of possibilities for that scenario. This simple method can be applied to all simulated variables.
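A minimal sketch of this ensemble-summary approach is given below, using numpy; the array shapes and values are illustrative, and in practice the members would be read from model output files already interpolated to a common grid:

```python
import numpy as np

# Illustrative ensemble: 20 members of 30-year annual-mean temperature change (degC)
# on a small latitude x longitude grid.
rng = np.random.default_rng(0)
n_members, n_years, n_lat, n_lon = 20, 30, 10, 12
ensemble = 2.0 + rng.normal(0.0, 0.8, size=(n_members, n_years, n_lat, n_lon))

# Average each member over the period of interest, then summarise across members.
member_means = ensemble.mean(axis=1)               # shape (member, lat, lon)

ensemble_mean = member_means.mean(axis=0)          # central estimate
p25 = np.percentile(member_means, 25, axis=0)      # lower bound of the uncertainty range
p75 = np.percentile(member_means, 75, axis=0)      # upper bound of the uncertainty range
```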

The simplest way of addressing bias in climate models, however, is not to use absolute values, but rather the change, or anomalies, relative to the simulated historical mean (e.g., 1971-2000). The assumption here is that even if the present-day climate is not perfectly simulated, there is value in the simulated change: the bias is assumed to be the same in the present-day and future simulations, and so effectively cancels out when calculating the anomalies. The change, or anomalies, can be computed for all simulated variables. Scenarios of future climate are sometimes obtained by adding these anomalies to the observed characteristics of spatial and temporal variability. This procedure is known as the ‘delta change’ method and requires a good estimate of the historical variables over a long period of time for the region of interest. Estimating these historical characteristics is an important issue for some variables in regions with sparse observation networks (e.g., the Canadian North).
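A minimal sketch of the delta change method for a single location is shown below; the function name and the monthly values are illustrative, and the multiplicative option reflects the common practice (an assumption here, not a statement about any particular dataset) of applying ratio changes to precipitation:

```python
import numpy as np

def delta_change(obs_hist, sim_hist, sim_fut, multiplicative=False):
    """Apply the delta change method to climatological means.

    obs_hist : observed historical climatology (e.g., 1971-2000 monthly means)
    sim_hist : simulated historical climatology for the same period
    sim_fut  : simulated future climatology (e.g., 2041-2070)

    Additive deltas are typical for temperature; a multiplicative (ratio)
    form is often preferred for precipitation.
    """
    if multiplicative:
        return obs_hist * (sim_fut / sim_hist)   # fractional change applied to observations
    return obs_hist + (sim_fut - sim_hist)       # absolute change (anomaly) added to observations

# Illustrative monthly mean temperatures (degC) for one location.
obs_tas  = np.array([-25., -22., -15., -5., 4., 10., 14., 12., 5., -4., -15., -22.])
sim_hist = obs_tas + 1.5                         # model is 1.5 degC too warm (a bias)
sim_fut  = sim_hist + 3.0                        # model projects 3 degC of warming

future_scenario = delta_change(obs_tas, sim_hist, sim_fut)
print(future_scenario - obs_tas)                 # +3 degC everywhere: the bias cancels out
```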

Statistical bias adjustment methods are also available, again ranging from the relatively simple to the very complex. The more complex of these methods can correct systematic errors in simulated mean values, in the quantiles[1] of a distribution, or even in the dependence between variables (e.g., Teutschbein and Seibert, 2012). All of these methods, however, require an observational climatology as the target of the correction process. This ‘target’ dataset can be a station dataset if the bias correction is done at a local level, but in most cases a gridded historical dataset is required for corrections over a region.
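As one concrete example at the simpler end of this family, the sketch below implements empirical quantile mapping for a single location using numpy; the variable names and synthetic data are illustrative, and operational implementations typically work month by month or with moving windows:

```python
import numpy as np

def quantile_map(sim, obs_ref, sim_ref, n_quantiles=100):
    """Empirical quantile mapping at one location: adjust simulated values so
    that the calibration-period simulation matches the observed quantiles.

    sim     : simulated values to be corrected (e.g., a future period)
    obs_ref : observations for the calibration period
    sim_ref : simulation for the same calibration period
    """
    probs = np.linspace(0.01, 0.99, n_quantiles)
    obs_q = np.quantile(obs_ref, probs)
    sim_q = np.quantile(sim_ref, probs)

    # Find each simulated value's non-exceedance probability in the model's
    # calibration distribution, then map it onto the observed quantiles.
    tau = np.interp(sim, sim_q, probs)
    return np.interp(tau, probs, obs_q)

# Illustrative daily temperatures: the model is warm-biased and too variable.
rng = np.random.default_rng(1)
obs_ref = rng.normal(0.0, 5.0, 10950)    # ~30 years of observed days
sim_ref = rng.normal(2.0, 7.0, 10950)    # biased historical simulation
sim_fut = rng.normal(5.0, 7.0, 10950)    # future simulation (same biases assumed)

corrected = quantile_map(sim_fut, obs_ref, sim_ref)
```

Note that this simple form corrects the marginal distribution of a single variable only; correcting the dependence between variables requires the more complex, multivariate methods mentioned above.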

Producing an ensemble of bias-corrected simulations therefore depends on the availability of a good gridded dataset of historical observations. For Canada, bias-corrected ensembles based on CMIP5 and CMIP6 simulations have been developed for maximum and minimum temperature and precipitation.

For example, researchers at the Pacific Climate Impacts Consortium (PCIC) and Environment and Climate Change Canada (ECCC) have developed a statistical methodology known as BCCAQv2 (Bias Correction with Constructed Analogues and Quantile Mapping, Version 2; Cannon et al., 2015) that has been used to bias adjust and downscale ensembles of GCM simulations for both CMIP5 and CMIP6. This method has been used to downscale maximum and minimum temperature and accumulated precipitation from the GCM resolution to a target grid of approximately 10 km for all of Canada. It combines two separate downscaling methodologies, bias-corrected constructed analogues (BCCA; Maurer et al., 2010) and quantile delta mapping (QDM; Cannon et al., 2015), and leverages strengths from each to provide a final product that outperforms either individual method. This methodology has been extensively evaluated at the national scale using a wide range of climate indices (those identified by the Expert Team on Climate Change Detection and Indices [ETCCDI]) to determine its ability to capture the temporal sequence of events, the statistical distribution of values and the spatial structure of values. However, there are no studies dedicated exclusively to Northern Canada.
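The quantile delta mapping component can be sketched for a single location as below; this follows the general additive/ratio form described by Cannon et al. (2015), but omits the constructed-analogue (BCCA) step and the moving windows used in the full BCCAQv2 procedure, so it is illustrative rather than a reproduction of the operational method:

```python
import numpy as np

def qdm(sim_fut, obs_ref, sim_ref, additive=True, n_quantiles=100):
    """Quantile delta mapping at one location (additive form for temperature,
    ratio form for precipitation). Unlike plain quantile mapping, QDM preserves
    the model-projected change at each quantile while correcting the
    distribution toward observations."""
    probs = np.linspace(0.01, 0.99, n_quantiles)
    obs_q  = np.quantile(obs_ref, probs)
    href_q = np.quantile(sim_ref, probs)
    fut_q  = np.quantile(sim_fut, probs)

    # Non-exceedance probability of each future value within the future period.
    tau = np.interp(sim_fut, fut_q, probs)

    if additive:
        change = sim_fut - np.interp(tau, probs, href_q)   # projected change at that quantile
        return np.interp(tau, probs, obs_q) + change
    change = sim_fut / np.interp(tau, probs, href_q)        # projected relative change
    return np.interp(tau, probs, obs_q) * change
```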

For BCCAQv2 over Canada, the observational target dataset is the ANUSPLIN dataset, which is available at 300 arc second spatial resolution (1/12° grids, approx. 10 km) and consists of daily minimum and maximum temperature, and precipitation amount for 1950-2013 (Hopkinson et al., 2011; McKenney et al., 2011). A detailed description of the ANUSPLIN dataset is provided in annexes 7.1.15, 7.2.16, and 7.5.9.

The BCCAQv2 methodology downscales maximum and minimum temperature and precipitation independently. This ensures statistical coherence for each variable, but the day-to-day relationships between the corrected variables are not necessarily preserved, and so the output should not be used to construct compound indices that require physical consistency between the input variables. In such cases, a multivariate statistical downscaling method that preserves the dependence between variables is necessary (e.g., Cannon, 2016).

[1] Quantiles are values that divide the range of a probability distribution into intervals with each interval containing the same fraction of the total population.