Comparing proxy and model estimates of hydroclimate variability and change over the Common Era

Author(s):  
Jason E. Smerdon ◽  
Jürg Luterbacher ◽  
Steven J. Phipps ◽  
Kevin J. Anchukaitis ◽  
Toby Ault ◽  
...  

Abstract. Water availability is fundamental to societies and ecosystems, but our understanding of variations in hydroclimate (including extreme events, flooding, and decadal periods of drought) is limited because of a paucity of modern instrumental observations that are distributed unevenly across the globe and only span parts of the 20th and 21st centuries. Such data coverage is insufficient for characterizing hydroclimate and its associated dynamics because of its multidecadal to centennial variability and highly regionalized spatial signature. High-resolution (seasonal to decadal) hydroclimatic proxies that span all or parts of the Common Era (CE) and paleoclimate simulations from climate models are therefore important tools for augmenting our understanding of hydroclimate variability. In particular, the comparison of the two sources of information is critical for addressing the uncertainties and limitations of both while enriching each of their interpretations. We review the principal proxy data available for hydroclimatic reconstructions over the CE and highlight the contemporary understanding of how these proxies are interpreted as hydroclimate indicators. We also review the available last-millennium simulations from fully coupled climate models and discuss several outstanding challenges associated with simulating hydroclimate variability and change over the CE. A specific review of simulated hydroclimatic changes forced by volcanic events is provided, as is a discussion of expected improvements in estimated radiative forcings, models, and their implementation in the future. Our review of hydroclimatic proxies and last-millennium model simulations is used as the basis for articulating a variety of considerations and best practices for how to perform proxy–model comparisons of CE hydroclimate. This discussion provides a framework for how best to evaluate hydroclimate variability and its associated dynamics using these comparisons and how they can better inform interpretations of both proxy data and model simulations. We subsequently explore means of using proxy–model comparisons to better constrain and characterize future hydroclimate risks. This is explored specifically in the context of several examples that demonstrate how proxy–model comparisons can be used to quantitatively constrain future hydroclimatic risks as estimated from climate model projections.

2017 ◽  
Vol 13 (12) ◽  
pp. 1851-1900 ◽  


2017 ◽  
Vol 13 (12) ◽  
pp. 1831-1850 ◽  
Author(s):  
Kristina Seftigen ◽  
Hugues Goosse ◽  
Francois Klein ◽  
Deliang Chen

Abstract. The integration of climate proxy information with general circulation model (GCM) results offers considerable potential for deriving greater understanding of the mechanisms underlying climate variability, as well as unique opportunities for out-of-sample evaluations of model performance. In this study, we combine insights from a new tree-ring hydroclimate reconstruction from Scandinavia with projections from a suite of forced transient simulations of the last millennium and historical intervals from the CMIP5 and PMIP3 archives. Model simulations and proxy reconstruction data are found to broadly agree on the modes of atmospheric variability that produce droughts and pluvials in the region. Despite these dynamical similarities, large differences between simulated and reconstructed hydroclimate time series remain. We find that the GCM-simulated multi-decadal and/or longer hydroclimate variability is systematically smaller than the proxy-based estimates, whereas the dominance of GCM-simulated high-frequency components of variability is not reflected in the proxy record. Furthermore, the paleoclimate evidence indicates in-phase coherencies between regional hydroclimate and temperature on decadal timescales, i.e., sustained wet periods have often been concurrent with warm periods and vice versa. The CMIP5–PMIP3 archive suggests, however, out-of-phase coherencies between the two variables in the last millennium. The lack of adequate understanding of mechanisms linking temperature and moisture supply on longer timescales has serious implications for attribution and prediction of regional hydroclimate changes. Our findings stress the need for further paleoclimate data–model intercomparison efforts to expand our understanding of the dynamics of hydroclimate variability and change, to enhance our ability to evaluate climate models, and to provide a more comprehensive view of future drought and pluvial risks.
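The low- versus high-frequency mismatch described above can be checked with a simple spectral diagnostic. The sketch below is illustrative only: it uses synthetic red-noise stand-ins (not the Scandinavian reconstruction or any CMIP5 output) and Welch periodograms to compare the fraction of variance at periods longer than 30 years in a "proxy-like" versus a "model-like" series.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
n_years = 1000

def red_noise(phi, n, rng):
    """AR(1) series; larger phi concentrates variance at low frequencies."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

proxy = red_noise(0.7, n_years, rng)  # persistent, multi-decadal variance
model = red_noise(0.2, n_years, rng)  # dominated by high frequencies

def band_variance(x, f_lo, f_hi, fs=1.0):
    """Sum Welch spectral power between two frequencies (cycles per year)."""
    f, p = welch(x, fs=fs, nperseg=256)
    mask = (f >= f_lo) & (f < f_hi)
    return p[mask].sum()

# Fraction of spectral power at periods longer than 30 years.
for name, x in [("proxy-like", proxy), ("model-like", model)]:
    frac = band_variance(x, 0.0, 1 / 30) / band_variance(x, 0.0, 0.5)
    print(f"{name}: low-frequency fraction = {frac:.2f}")
```

With these parameter choices the proxy-like series carries a visibly larger low-frequency fraction, which is the qualitative pattern the abstract reports.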


2009 ◽  
Vol 5 (5) ◽  
pp. 2115-2156 ◽  
Author(s):  
M. Widmann ◽  
H. Goosse ◽  
G. van der Schrier ◽  
R. Schnur ◽  
J. Barkmeijer

Abstract. Climate proxy data provide noisy and spatially incomplete information on some aspects of past climate states, whereas palaeosimulations with climate models provide global, multi-variable states, which may, however, differ from the true states due to unpredictable internal variability not related to climate forcings, as well as due to model deficiencies. Using data assimilation to combine the empirical information from proxy data with the physical understanding of the climate system represented by the equations in a climate model is in principle a promising way to obtain better estimates for the climate of the past. Data assimilation has been used for a long time in weather forecasting and atmospheric analyses to control the states in atmospheric General Circulation Models such that they are in agreement with observations from surface, upper air, and satellite measurements. Here we discuss the similarities and the differences between the data assimilation problem in palaeoclimatology and in weather forecasting, and present and conceptually compare three data assimilation methods that have been developed in recent years for applications in palaeoclimatology. All three methods (selection of ensemble members, Forcing Singular Vectors, and Pattern Nudging) are illustrated by examples that are related to climate variability over the extratropical Northern Hemisphere during the last millennium. In particular, it is shown that all three methods suggest that the cold period over Scandinavia during 1790–1820 is linked to anomalous northerly or easterly atmospheric flow, which in turn is related to a pressure anomaly that resembles a negative state of the Northern Annular Mode.
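The first of the three methods, selection of ensemble members, can be illustrated with a toy example. Everything below is a hypothetical sketch, not the published implementation: synthetic "proxy" observations and ensemble states, with members ranked by their misfit to the proxies and only the best-fitting subset retained.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy setup (all numbers illustrative): a "true" state at a few proxy sites,
# noisy proxy observations of it, and a model ensemble scattered around it.
n_members, n_sites = 30, 8
truth = rng.standard_normal(n_sites)
proxies = truth + 0.3 * rng.standard_normal(n_sites)
ensemble = truth + 1.0 * rng.standard_normal((n_members, n_sites))

def select_members(ensemble, proxies, k=5):
    """Rank ensemble members by their RMSE against the proxy values and
    return the indices of the best k. This mimics the 'selection of
    ensemble members' idea: only model states consistent with the proxy
    information are retained."""
    rmse = np.sqrt(((ensemble - proxies) ** 2).mean(axis=1))
    return np.argsort(rmse)[:k], rmse

best, rmse = select_members(ensemble, proxies, k=5)
analysis = ensemble[best].mean(axis=0)  # analysis state = mean of kept members
print("kept members:", best)
```

In the published method the selection is applied sequentially in time and the retained states are used to restart the model; the toy above shows only the ranking-and-retention step.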


2010 ◽  
Vol 6 (5) ◽  
pp. 627-644 ◽  
Author(s):  
M. Widmann ◽  
H. Goosse ◽  
G. van der Schrier ◽  
R. Schnur ◽  
J. Barkmeijer



2012 ◽  
Vol 8 (1) ◽  
pp. 263-320 ◽  
Author(s):  
A. Hind ◽  
A. Moberg ◽  
R. Sundberg

Abstract. A statistical framework for comparing the output of ensemble simulations from global climate models with networks of climate proxy and instrumental records is developed, focusing on near-surface temperatures for the last millennium. This framework includes the formulation of a joint statistical model for proxy data, instrumental data and simulation data, which is used to optimize a quadratic distance measure for ranking climate model simulations. An essential underlying assumption is that the simulations and the proxy/instrumental series have a shared component of variability that is due to temporal changes in external forcing, such as volcanic aerosol load, solar irradiance changes and greenhouse gas concentrations. Two statistical tests are formulated. First, a preliminary test establishes whether a significant temporal correlation exists between instrumental/proxy and simulation data. Second, the distance measure is expressed in the form of a test statistic of whether a forced simulation is closer to the instrumental/proxy series than unforced simulations. The proposed framework allows any number of proxy locations to be used jointly, with different seasons, record lengths and statistical precision. The new methods are applied in a pseudo-proxy experiment. Here, a set of previously published millennial forced model simulations, including both "low" and "high" solar radiative forcing histories together with other common forcings, were used to define "true" target temperatures as well as pseudo-proxy and pseudo-instrumental series. The pseudo-proxies were created to reflect current proxy locations and noise levels, where it was found that the low and high solar full-forcing simulations could be distinguished when the latter were used as targets. When the former were used as targets, a greater number of proxy locations were needed to make this distinction. It was also found that to improve detectability of the low solar simulations, increasing the signal-to-noise ratio was more efficient than increasing the spatial coverage of the proxy network. In the next phase of the work, we will apply these methods to real proxy and instrumental data, with the aim of distinguishing which of the two solar forcing histories is most compatible with the observed/reconstructed climate.
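A minimal pseudo-proxy version of the preliminary correlation test might look as follows. All series, amplitudes and noise levels below are invented for illustration: a "true" target built from a shared forcing signal, a pseudo-proxy degraded to a chosen signal-to-noise ratio, and forced versus unforced simulations tested for temporal correlation with the pseudo-proxy.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_years = 1000

# Hypothetical "true" target: a common forcing signal plus internal variability.
forcing_signal = 0.8 * np.sin(2 * np.pi * np.arange(n_years) / 200)
truth = forcing_signal + 0.5 * rng.standard_normal(n_years)

def make_pseudo_proxy(truth, snr, rng):
    """Degrade the target with white noise to a chosen signal-to-noise
    ratio, mimicking how pseudo-proxies are built from model output."""
    noise = rng.standard_normal(truth.size) * (truth.std() / snr)
    return truth + noise

pseudo_proxy = make_pseudo_proxy(truth, snr=0.5, rng=rng)

# An independent forced simulation shares the forcing signal but has its own
# internal variability; an unforced control run shares nothing with the proxy.
forced_sim = forcing_signal + 0.5 * rng.standard_normal(n_years)
control_sim = 0.5 * rng.standard_normal(n_years)

# Preliminary test: is the pseudo-proxy significantly correlated with the
# forced simulation, while uncorrelated with the unforced control?
r_forced, p_forced = stats.pearsonr(pseudo_proxy, forced_sim)
r_control, p_control = stats.pearsonr(pseudo_proxy, control_sim)
print(f"forced:  r={r_forced:.2f}")
print(f"control: r={r_control:.2f}")
```

The shared forcing component yields a clearly significant correlation for the forced run even at this low signal-to-noise ratio, which is the logic behind the framework's first test.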


2020 ◽  
Author(s):  
Janica Carmen Bühler ◽  
Carla Roesch ◽  
Moritz Kirschner ◽  
Louise Sime ◽  
Max D Holloway ◽  
...  

Abstract. Global changes in the climate, especially the warming trend in mean temperature, have received increasing public and scientific attention. Improving the understanding of changes in the mean and variability of climate variables, as well as their interrelation, is crucial for reliable climate change projections. Comparisons between general circulation models and paleoclimate archives using indirect proxies for temperature and/or precipitation have been used to test and validate the capability of climate models to represent climate changes. The oxygen isotopic ratio δ18O is routinely measured in speleothem samples at decadal or higher resolution, and single specimens can cover full Glacial-Interglacial cycles. The calcium carbonate cave deposits are precisely dateable and provide well-preserved, (semi-)continuous, albeit multivariate, climate signals in the lower and mid-latitudes, where the measured δ18O in the mineral does not directly represent temperature or precipitation. Speleothems are therefore suitable archives for assessing the ability of climate models to simulate climate variability beyond the timescales covered by meteorological observations (10–100 yr). Here, we present three transient isotope-enabled simulations from the Hadley Centre Climate Model version 3 (iHadCM3) covering the last millennium (850–1850 CE) and compare these to a large global dataset of speleothem δ18O records from the Speleothem Isotopes Synthesis and AnaLysis (SISAL) database version 2 (Comas-Bru et al., 2020). We systematically evaluate offsets in the mean and variance of simulated δ18O and test for the main climate drivers of individual records or regions. The time-mean spatial offsets between the simulated δ18O and the speleothem data are fairly small. However, using robust filters and spectral analysis, we show that the observed proxy-based variability of δ18O is lower (higher) than simulated by iHadCM3 on decadal (centennial) timescales. Most of this difference can likely be attributed to the records' lower temporal resolution and averaging processes affecting the δ18O signal. Using cross-correlation analyses at site level and modeled gridbox level, we find evidence for highly variable but generally low signal-to-noise ratios in the proxy data. This points to a high influence of cave-internal processes and regional climate particularities and could suggest low regional representativity of individual sites. Long-range strong positive correlations dominate the speleothem correlation network but are much weaker in the simulation. One reason for this could lie in a lack of long-term internal climate variability in these model simulations, which could be tested by repeating similar comparisons with other isotope-enabled climate models and paleoclimate databases.
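The damping effect of the records' coarser resolution can be demonstrated with a short sketch. The series below is synthetic white noise standing in for an annually resolved simulated δ18O signal (not iHadCM3 or SISAL data); block-averaging it to a roughly 25-year effective resolution shows how much variance the averaging alone removes.

```python
import numpy as np

rng = np.random.default_rng(7)
n_years = 1000

# Hypothetical annually resolved simulated delta-18O at a cave site
# (values in per mil, purely illustrative).
sim_d18o = -6.0 + 0.8 * rng.standard_normal(n_years)

def block_average(x, window):
    """Average consecutive blocks of length `window`, mimicking the
    smoothing that a speleothem's growth and sampling resolution impose
    on the recorded signal."""
    n = (x.size // window) * window
    return x[:n].reshape(-1, window).mean(axis=1)

proxy_like = block_average(sim_d18o, window=25)  # ~25-yr effective resolution

mean_offset = proxy_like.mean() - sim_d18o.mean()
var_ratio = proxy_like.var() / sim_d18o.var()
print(f"mean offset: {mean_offset:.3f} per mil")
print(f"variance ratio (proxy-like / annual): {var_ratio:.2f}")
```

The block averaging leaves the mean essentially unchanged while removing most of the high-frequency variance, consistent with the resolution argument made in the abstract (real archives additionally mix years through karst and growth processes, which this sketch ignores).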


2012 ◽  
Vol 8 (4) ◽  
pp. 1339-1353 ◽  
Author(s):  
R. Sundberg ◽  
A. Moberg ◽  
A. Hind

Abstract. A statistical framework for comparing the output of ensemble simulations from global climate models with networks of climate proxy and instrumental records has been developed, focusing on near-surface temperatures for the last millennium. This framework includes the formulation of a joint statistical model for proxy data, instrumental data and simulation data, which is used to optimize a quadratic distance measure for ranking climate model simulations. An essential underlying assumption is that the simulations and the proxy/instrumental series have a shared component of variability that is due to temporal changes in external forcing, such as volcanic aerosol load, solar irradiance or greenhouse gas concentrations. Two statistical tests have been formulated. Firstly, a preliminary test establishes whether a significant temporal correlation exists between instrumental/proxy and simulation data. Secondly, the distance measure is expressed in the form of a test statistic of whether a forced simulation is closer to the instrumental/proxy series than unforced simulations. The proposed framework allows any number of proxy locations to be used jointly, with different seasons, record lengths and statistical precision. The goal is to objectively rank several competing climate model simulations (e.g. with alternative model parameterizations or alternative forcing histories) by means of their goodness of fit to the unobservable true past climate variations, as estimated from noisy proxy data and instrumental observations.
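A stripped-down version of the ranking idea, assuming a plain mean-squared distance in place of the optimized statistic (which also accounts for proxy noise, seasonality and record length), could look like this. All series and amplitudes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_years = 500

# Illustrative target: episodic volcanic-like cooling plus internal noise,
# observed through a noisy proxy series.
forcing = np.where(np.arange(n_years) % 100 < 5, -2.0, 0.0)
truth = forcing + 0.3 * rng.standard_normal(n_years)
proxy = truth + 0.4 * rng.standard_normal(n_years)

def quadratic_distance(sim, proxy, weight=1.0):
    """A simple quadratic distance between simulation and proxy series
    (a stand-in for the framework's optimized distance measure)."""
    return weight * np.mean((sim - proxy) ** 2)

# Competing simulations: one driven by the same forcing history, one unforced.
simulations = {
    "forced": forcing + 0.3 * rng.standard_normal(n_years),
    "unforced": 0.3 * rng.standard_normal(n_years),
}
ranking = sorted(simulations,
                 key=lambda k: quadratic_distance(simulations[k], proxy))
print("ranking (best first):", ranking)
```

Because the forced run shares the forcing-driven component of variability with the proxy, it ends up closer under the distance measure, which is exactly what the framework's second test formalizes.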


2020 ◽  
Author(s):  
Hugues Goosse ◽  
Gaelle Gilson ◽  
François Klein ◽  
Guillaume Lenoir ◽  
Anne de Vernal ◽  
...  

The mismatch between oceanic proxy data and climate model results over the past millennia has been a long-lasting challenge. Although both are valuable sources of paleoclimate information, there is a strong discrepancy in variance between models and proxies, so that they cannot be compared directly. In addition, local sea-surface temperature (SST) reconstructions are often inconsistent among proxy types. We first performed several offline data assimilation experiments with different standardized SST proxy datasets using the climate models LOVECLIM and CESM in order to investigate the effect of proxy selection on local and regional reconstructions over the Common Era (0–2000 CE). All experiments work technically at the local scale, but the spatial patterns of the reconstructions vary with the type(s), number and density of proxies and, where there are no proxies, with the choice of the model. We then developed empirical scaling factors based on independent SST observations to correct for the discrepancy between model and proxy amplitude. While it is essential to scale the proxies, scaling the model leads to complications because of the biases in the sea ice extent. Data assimilation of scaled proxies results in coherent SST reconstructions at the scale of the North Atlantic, with timing and amplitude that are in agreement with those given by forced models. Finally, results are compared to online data assimilation experiments.
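One simple way to implement such an empirical scaling factor is variance matching over the instrumental overlap period. The sketch below is an assumption-laden illustration, not the authors' procedure: a synthetic "observed" SST series and a damped proxy, with the factor chosen so that the scaled proxy anomalies reproduce the observed standard deviation.

```python
import numpy as np

rng = np.random.default_rng(5)

# Illustrative overlap period: "instrumental" SSTs and a proxy whose
# amplitude is too small relative to the observations (a common mismatch).
n_overlap = 120
sst_obs = (0.6 * np.sin(2 * np.pi * np.arange(n_overlap) / 60)
           + 0.2 * rng.standard_normal(n_overlap))
proxy = 0.3 * sst_obs + 0.1 * rng.standard_normal(n_overlap)  # damped proxy

def scaling_factor(proxy, obs):
    """Empirical factor matching the proxy's standard deviation to the
    observations over their common period (anomalies are taken first).
    One simple way to standardize amplitudes before assimilation."""
    p = proxy - proxy.mean()
    o = obs - obs.mean()
    return o.std() / p.std()

k = scaling_factor(proxy, sst_obs)
scaled_proxy = (proxy - proxy.mean()) * k + sst_obs.mean()
print(f"scaling factor: {k:.2f}")
```

After scaling, the proxy's variance matches the observations by construction; note that this inflates proxy noise along with the signal, one reason amplitude correction remains delicate in practice.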

