Sensitivity of snow models to the accuracy of meteorological forcings in mountain environments

2019 ◽  
Author(s):  
Silvia Terzago ◽  
Valentina Andreoli ◽  
Gabriele Arduini ◽  
Gianpaolo Balsamo ◽  
Lorenzo Campo ◽  
...  

Abstract. Snow models are usually evaluated at sites providing high-quality meteorological data, so that the uncertainty in the meteorological input data can be neglected when assessing the model performances. However, high-quality input data are rarely available in mountain areas and, in practical applications, the meteorological forcing to drive snow models is typically derived from spatial interpolation of the available in-situ data or from reanalyses, whose accuracy can be considerably lower. In order to fully characterize the performances of a snow model, the model sensitivity to errors in the input data should be quantified. In this study we test the ability of six snow models to reproduce snow water equivalent, snow density and snow depth when they are forced by meteorological input data with gradually lower accuracy. The SNOWPACK, GEOTOP, HTESSEL, UTOPIA, SMASH and S3M snow models are forced, first, with high-quality measurements performed at the experimental site of Torgnon, located at 2160 m a.s.l. in the Italian Alps (control run). Then, the models are forced by data at gradually lower temporal and/or spatial resolutions, obtained (i) by sampling the original Torgnon 30-minute time series at 3, 6, and 12 hours, (ii) by spatially interpolating neighboring in-situ station measurements and (iii) by extracting information from GLDAS, ERA5, ERA-Interim reanalyses at the gridpoint closest to the Torgnon station. Since the selected models are characterized by different degrees of complexity, from highly sophisticated multi-layer snow models to simple, empirical, single-layer snow schemes, we also discuss the results of these experiments in relation to the model complexity. 
Results show that when forced by accurate 30-min resolution weather station data, the single-layer, intermediate-complexity snow models HTESSEL and UTOPIA provide similar skills to the more sophisticated multi-layer model SNOWPACK, and these three models show better agreement with observations and more robust performances over different seasons compared to the lower-complexity models SMASH and S3M. All models forced by 3-hourly data provide similar skills to the control run, while with 6- and 12-hourly temporal resolution forcings we generally observe a reduction in model performances, except for the SMASH model, which shows low sensitivity to the temporal degradation of the input data. Spatially interpolated data from neighboring stations and reanalyses are found to be adequate forcings, provided that temperature and precipitation variables are not affected by large biases over the considered period. A simple bias-adjustment technique applied to ERA-Interim temperatures, however, allowed all models to achieve similar performances to those of the control run. All models, irrespective of their complexity, show weaknesses in the representation of the snow density.
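The degraded temporal forcings of experiment (i) amount to subsampling the 30-minute record at the 3-, 6- and 12-hour marks. A minimal sketch of that subsampling, with a hypothetical two-day temperature series (not the authors' code or data):

```python
from datetime import datetime, timedelta

# Hypothetical 30-min forcing record (time, air temperature in degC) over two days
start = datetime(2014, 1, 1)
record = [(start + timedelta(minutes=30 * i), -10.0 + 0.05 * i) for i in range(96)]

def subsample(rec, hours):
    """Keep only the instantaneous values falling on every `hours`-hour mark,
    mimicking the degraded 3-, 6- and 12-hourly forcings."""
    return [(t, v) for t, v in rec if t.minute == 0 and t.hour % hours == 0]

forcing_3h = subsample(record, 3)    # 8 samples per day
forcing_12h = subsample(record, 12)  # 2 samples per day
```

Instantaneous sampling is assumed here; time-averaging over each window would be an equally plausible degradation strategy.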

2020 ◽  
Vol 24 (8) ◽  
pp. 4061-4090 ◽  
Author(s):  
Silvia Terzago ◽  
Valentina Andreoli ◽  
Gabriele Arduini ◽  
Gianpaolo Balsamo ◽  
Lorenzo Campo ◽  
...  

Abstract. Snow models are usually evaluated at sites providing high-quality meteorological data, so that the uncertainty in the meteorological input data can be neglected when assessing model performances. However, high-quality input data are rarely available in mountain areas and, in practical applications, the meteorological forcing used to drive snow models is typically derived from spatial interpolation of the available in situ data or from reanalyses, whose accuracy can be considerably lower. In order to fully characterize the performances of a snow model, the model sensitivity to errors in the input data should be quantified. In this study we test the ability of six snow models to reproduce snow water equivalent, snow density and snow depth when they are forced by meteorological input data with gradually lower accuracy. The SNOWPACK, GEOTOP, HTESSEL, UTOPIA, SMASH and S3M snow models are forced, first, with high-quality measurements performed at the experimental site of Torgnon, located at 2160 m a.s.l. in the Italian Alps (control run). Then, the models are forced by data at gradually lower temporal and/or spatial resolution, obtained by (i) sampling the original Torgnon 30 min time series at 3, 6, and 12 h, (ii) spatially interpolating neighbouring in situ station measurements and (iii) extracting information from GLDAS, ERA5 and ERA-Interim reanalyses at the grid point closest to the Torgnon site. Since the selected models are characterized by different degrees of complexity, from highly sophisticated multi-layer snow models to simple, empirical, single-layer snow schemes, we also discuss the results of these experiments in relation to the model complexity. 
The results show that, when forced by accurate 30 min resolution weather station data, the single-layer, intermediate-complexity snow models HTESSEL and UTOPIA provide similar skills to the more sophisticated multi-layer model SNOWPACK, and these three models show better agreement with observations and more robust performances over different seasons compared to the lower-complexity models SMASH and S3M. All models forced by 3-hourly data provide similar skills to the control run, while the use of 6- and 12-hourly temporal resolution forcings may lead to a reduction in model performances if the incoming shortwave radiation is not properly represented. The SMASH model generally shows low sensitivity to the temporal degradation of the input data. Spatially interpolated data from neighbouring stations and reanalyses are found to be adequate forcings, provided that temperature and precipitation variables are not affected by large biases over the considered period. However, a simple bias-adjustment technique applied to ERA-Interim temperatures allowed all models to achieve similar performances to the control run. Regardless of their complexity, all models show weaknesses in the representation of the snow density.
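The abstract does not detail the "simple bias-adjustment technique" applied to the ERA-Interim temperatures; a common minimal choice is additive mean-bias removal against the station record, sketched here with hypothetical values:

```python
def remove_mean_bias(t_reanalysis, t_station):
    """Shift the reanalysis series so that its mean over a common period
    matches the station mean (simple additive bias adjustment)."""
    bias = sum(t_reanalysis) / len(t_reanalysis) - sum(t_station) / len(t_station)
    return [t - bias for t in t_reanalysis]

t_era = [-6.0, -4.0, -2.0, 0.0]   # hypothetical ERA-Interim 2 m temperatures (degC)
t_obs = [-8.0, -6.0, -4.0, -2.0]  # hypothetical station temperatures (degC)
t_adj = remove_mean_bias(t_era, t_obs)  # warm bias of 2 degC removed
```

More elaborate schemes (monthly-varying bias, quantile mapping) exist; the additive shift is only the simplest consistent reading of the text.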


2021 ◽  
Author(s):  
Colleen Mortimer ◽  
Lawrence Mudryk ◽  
Chris Derksen ◽  
Kari Luojus ◽  
Pinja Venalainen ◽  
...  

<p>The European Space Agency Snow CCI+ project provides global homogenized long time series of daily snow extent and snow water equivalent (SWE). The Snow CCI SWE product is built on the Finnish Meteorological Institute's GlobSnow algorithm, which combines passive microwave data with in situ snow depth information to estimate SWE. The CCI SWE product improves upon previous versions of GlobSnow through targeted changes to the spatial resolution, ancillary data, and snow density parameterization.</p><p>Previous GlobSnow SWE products used a constant snow density of 0.24 g cm<sup>-3</sup> to convert snow depth to SWE. The CCI SWE product applies spatially and temporally varying density fields, derived by kriging in situ snow density information from historical snow transects, to correct biases in estimated SWE. Grid spacing was improved from 25 km to 12.5 km by applying an enhanced-spatial-resolution microwave brightness temperature dataset. We assess step-wise how each of these targeted changes acts to improve or worsen the product by evaluating against snow transect measurements and comparing hemispheric snow mass and trend differences.</p><p>Together, when compared to GlobSnow v3, these changes improved RMSE by ~5 cm and correlation by ~0.1 against a suite of snow transect measurements from Canada, Finland, and Russia. Although the hemispheric snow mass anomalies of CCI SWE and GlobSnow v3 are similar, there are sizeable differences in the climatological SWE, most notably a one-month delay in the timing of peak SWE and lower SWE during the accumulation season. These shifts were expected because the variable snow density is lower than the former fixed value early in the snow season but then increases over the course of the snow season. We also examine intermediate products to determine the relative improvements attributable solely to the increased spatial resolution versus changes due to the snow density parameterizations. 
Such systematic evaluations are critical to directing future product development.</p>
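The depth-to-SWE conversion underlying these density choices is a simple density scaling. A minimal sketch, using a fixed density of 240 kg m^-3 (i.e. 0.24 g cm^-3, the former GlobSnow value) as an illustrative input:

```python
RHO_WATER = 1000.0  # density of water, kg m^-3

def swe_mm(depth_m, rho_snow):
    """Snow water equivalent in mm of water: snow depth scaled by the
    snow-to-water density ratio, then converted from metres to millimetres."""
    return depth_m * (rho_snow / RHO_WATER) * 1000.0

# A 1.0 m deep snowpack at a fixed density of 240 kg m^-3
# holds 240 mm of water equivalent.
swe = swe_mm(1.0, 240.0)
```

The CCI product replaces the constant `rho_snow` with a kriged, spatially and temporally varying density field; the conversion itself is unchanged.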


2013 ◽  
Vol 7 (2) ◽  
pp. 433-444 ◽  
Author(s):  
C. De Michele ◽  
F. Avanzi ◽  
A. Ghezzi ◽  
C. Jommi

Abstract. The snowpack is a complicated multiphase mixture with mechanical, hydraulic, and thermal properties highly variable during the year in response to climatic forcings. Bulk density is a macroscopic property of the snowpack used, together with snow depth, to quantify the water stored. In seasonal snowpacks, the bulk density is characterized by a strongly non-linear behaviour due to the occurrence of both dry and wet conditions. In the literature, bulk snow density estimates are obtained principally with multiple regressions, and snowpack models have focused principally on the snow depth and snow water equivalent. Here a one-dimensional model for the temporal dynamics of the snowpack, with particular attention to the bulk snow density, is proposed, accounting for both dry and wet conditions. The model represents the snowpack as a two-constituent mixture: a dry part, including ice structure and air, and a wet part, constituted by liquid water. It describes the dynamics of three variables: the depth and density of the dry part and the depth of liquid water. The model has been calibrated and validated against hourly data registered at three SNOTEL stations, western US, with mean values of the Nash–Sutcliffe coefficient ≈0.73–0.97 in the validation period.
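The Nash–Sutcliffe coefficient used here for calibration and validation can be sketched directly from its definition; the values below are hypothetical, not the authors' data:

```python
def nash_sutcliffe(observed, modelled):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of the squared model
    error to the variance of the observations. 1 = perfect fit,
    0 = no better than predicting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    err = sum((o - m) ** 2 for o, m in zip(observed, modelled))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - err / var

obs = [100.0, 150.0, 200.0, 250.0]  # hypothetical bulk snow densities (kg m^-3)
mod = [110.0, 140.0, 210.0, 240.0]
nse = nash_sutcliffe(obs, mod)  # 1 - 400/12500 = 0.968
```

Values in the 0.73–0.97 range reported above thus indicate the model explains most of the observed variance at the validation stations.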


2012 ◽  
Vol 6 (4) ◽  
pp. 2305-2325
Author(s):  
C. De Michele ◽  
F. Avanzi ◽  
A. Ghezzi ◽  
C. Jommi

Abstract. Snowpack is a complicated multiphase mixture with mechanical, hydraulic, and thermal properties highly variable within the year in response to climatic forcings. Bulk density is a macroscopic property of the snowpack used, together with snow depth, to quantify the water stored. In seasonal snowpacks, the bulk density is characterized by a strongly non-linear behaviour due to the occurrence of both dry and wet conditions. In the literature, bulk snow density estimates are obtained principally with multiple regressions, and snowpack models have focused principally on the snow depth and snow water equivalent. Here a one-dimensional model for the temporal dynamics of the bulk snow density is proposed, accounting for both dry and moist conditions. The model represents the snowpack as a two-constituent mixture: a dry part, including ice structure and air, and a wet part, constituted by liquid water. It describes the dynamics of three variables: the depth and density of the dry part and the depth of liquid water. The model has been calibrated and validated against hourly data registered at two SNOTEL stations, western US, with mean values of the Nash–Sutcliffe coefficient ≈0.90–0.92.


2017 ◽  
Vol 744 ◽  
pp. 458-462
Author(s):  
Xu Qiao ◽  
Zhi Lin ◽  
Yuan Yuan Si ◽  
Xiao Dan Lin ◽  
Shao Wei Cui ◽  
...  

High-quality graphene is prepared via in situ hydrogen exfoliation, in which stage-1 FeCl3-GIC reacts with sodium borohydride solution, followed by washing and sonication. The hydrogen evolved from the borohydride exfoliates the GIC and simultaneously reduces defect structures in the graphene, making it more conjugated. Raman spectra show that the intensity ratio of the D and G peaks is about 0.09, even smaller than that of the original graphite (0.17). The single C1s peak, located at 284.9 eV, further supports the presence of only one carbon structure in the graphene. The SEM image of the exfoliated graphene (Fig. 2(f)) shows a curly morphology, significantly different from that of graphite flakes. TEM shows single-layer graphene and its overlap with other graphene sheets. Atomic force microscopy (AFM) measurements show that the average thickness of the graphene sheets is about 0.530 nm, indicating that the high-quality graphene prepared is chiefly single layer. After being compression molded into a graphene mat, its conductivity reaches 2.85 × 10^5 S/m, about one third of the theoretical value for graphene. This method is promising for the mass production of high-quality graphene.


2021 ◽  
Vol 15 (12) ◽  
pp. 5371-5386
Author(s):  
Achut Parajuli ◽  
Daniel F. Nadeau ◽  
François Anctil ◽  
Marco Alves

Abstract. Cold content (CC) is an internal energy state within a snowpack and is defined by the energy deficit required to attain isothermal snowmelt temperature (0 °C). Cold content for a given snowpack thus plays a critical role because it affects both the timing and the rate of snowmelt. Measuring cold content is a labour-intensive task as it requires extracting in situ snow temperature and density. Hence, few studies have focused on characterizing this snowpack variable. This study describes the multilayer cold content of a snowpack and its variability across four sites with contrasting canopy structures within a coniferous boreal forest in southern Québec, Canada, throughout winter 2017–2018. The analysis was divided into two steps. In the first step, the observed CC data from weekly snowpits for 60 % of the snow cover period were examined. During the second step, a reconstructed time series of modelled CC was produced and analyzed to highlight the high-resolution temporal variability of CC for the full snow cover period. To accomplish this, the Canadian Land Surface Scheme (CLASS; featuring a single-layer snow model) was first implemented to obtain simulations of the average snow density at each of the four sites. Next, an empirical procedure was used to produce realistic density profiles, which, when combined with in situ continuous snow temperature measurements from an automatic profiling station, provides a time series of CC estimates at half-hour intervals for the entire winter. At the four sites, snow persisted on the ground for 218 d, with melt events occurring on 42 of those days. Based on snowpit observations, the largest mean CC (−2.62 MJ m−2) was observed at the site with the thickest snow cover. The maximum difference in mean CC between the four study sites was −0.47 MJ m−2, representing a site-to-site variability of 20 %. 
Before analyzing the reconstructed CC time series, a comparison with snowpit data confirmed that CLASS yielded reasonable bulk estimates of snow water equivalent (SWE) (R2 = 0.64 and percent bias (Pbias) = −17.1 %), snow density (R2 = 0.71 and Pbias = 1.6 %), and cold content (R2 = 0.93 and Pbias = −3.3 %). A snow density profile derived by utilizing an empirical formulation also provided reasonable estimates of layered cold content (R2 = 0.42 and Pbias = 5.17 %). Thanks to these encouraging results, the reconstructed and continuous CC series could be analyzed at the four sites, revealing the impact of rain-on-snow and cold air pooling episodes on the variation of CC. The continuous multilayer cold content time series also provided us with information about the effect of stand structure, local topography, and meteorological conditions on cold content variability. Additionally, a weak relationship between canopy structure and CC was identified.
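The bulk cold content defined above follows a standard expression built from snow density, depth, and temperature. A minimal sketch, with an assumed specific heat of ice (2102 J kg^-1 K^-1) and illustrative snowpack values (not the authors' data):

```python
C_ICE = 2102.0  # specific heat capacity of ice, J kg^-1 K^-1 (assumed value)
T_MELT = 0.0    # melting-point reference, degC

def cold_content(rho_snow, depth_m, t_snow):
    """Bulk cold content (J m^-2): the energy deficit of a snowpack of given
    density (kg m^-3), depth (m) and mean temperature (degC) relative to an
    isothermal pack at 0 degC. Negative by convention for a cold pack."""
    return C_ICE * rho_snow * depth_m * (t_snow - T_MELT)

# Illustrative values: a 1.0 m deep pack at 250 kg m^-3 and -5 degC
cc_mj = cold_content(250.0, 1.0, -5.0) / 1e6  # J m^-2 -> MJ m^-2
```

For the multilayer case described in the paper, the same quantity would be computed per layer from each layer's density, thickness, and temperature and then summed over the profile.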


2021 ◽  
Author(s):  
Achut Parajuli ◽  
Daniel F. Nadeau ◽  
François Anctil ◽  
Marco Alves

Abstract. Cold content (CC) is an internal energy state within a snowpack and is defined by the energy deficit required to attain isothermal snowmelt temperature (0 °C). For any snowpack, fulfilling the cold content deficit is a pre-requisite before the onset of the snowmelt. Cold content for a given snowpack thus plays a critical role because it affects both the timing and the rate of snowmelt. Estimating the cold content is a labour-intensive task as it requires extracting in-situ snow temperature and density. Hence, few studies have focused on characterizing this snowpack variable. This study describes the multilayer cold content of a snowpack and its variability across four sites with contrasting canopy structures within a coniferous boreal forest in southern Québec, Canada, throughout winter 2017–18. The analysis was divided into two steps. In the first step, the observed CC data from weekly snowpits for 60 % of the snow cover period were examined. During the second step, a reconstructed time series of CC was produced and analyzed to highlight the high-resolution temporal variability of CC for the full snow cover period. To accomplish this, the Canadian Land Surface Scheme (CLASS; featuring a single-layer snow model) was first implemented to obtain simulations of the average snow density at each of the four sites. Next, an empirical procedure was used to produce realistic density profiles, which, when combined with in situ continuous snow temperature measurements from an automatic profiling station, provides a time series of CC estimates at half-hour intervals for the entire winter. At the four sites, snow persisted on the ground for 218 days, with melt events occurring on 42 of those days. Based on snowpit observations, the largest mean CC (−2.62 MJ m−2) was observed at the site with the thickest snow cover. The maximum difference in mean CC between the four study sites was −0.47 MJ m−2, representing a site-to-site variability of 20 %. 
Before analyzing the reconstructed CC time series, a comparison with snowpit data confirmed that CLASS yielded reasonable estimates of the snow water equivalent (SWE) (R2 = 0.64 and percent bias (Pbias) = −17.1 %), bulk snow density (R2 = 0.71 and Pbias = 1.6 %), and bulk cold content (R2 = 0.90 and Pbias = −2.0 %). A snow density profile derived by utilizing an empirical formulation also provided reasonable estimates of cold content (R2 = 0.42 and Pbias = 5.17 %). Thanks to these encouraging results, the reconstructed and continuous CC series could be analyzed at the four sites, revealing the impact of rain-on-snow and cold air pooling episodes on the variation of CC. The continuous multilayer cold content time series also provided us with information about the effect of stand structure, local topography, and meteorological conditions on cold content variability. Additionally, a weak relationship between canopy structure and CC was identified.


Author(s):  
Yoshichika Bando ◽  
Takahito Terashima ◽  
Kenji Iijima ◽  
Kazunuki Yamamoto ◽  
Kazuto Hirata ◽  
...  

High-quality thin films of high-Tc superconducting oxides are necessary for elucidating the superconducting mechanism and for device applications. The recent trend in the preparation of high-Tc films has been toward “in-situ” growth of the superconducting phase at relatively low temperatures. The purpose of “in-situ” growth is not only to attain the surface smoothness suitable for fabricating film devices but also to obtain high-quality films. We present an investigation of the initial growth of YBCO by the in-situ reflection high-energy electron diffraction (RHEED) technique and of the structural and superconducting properties of the resulting ultrathin films below 100 Å. The epitaxial films were grown on the (100) planes of MgO and SrTiO3, heated below 650°C, by activated reactive evaporation. In-situ RHEED observation and intensity measurements were carried out during deposition of YBCO on the substrate at 650°C. The deposition rate was 0.8 Å/s. Fig. 1 shows the RHEED patterns at each stage of deposition of YBCO on MgO(100). All the patterns exhibit sharp streaks, indicating that the film surface is atomically smooth and the growth proceeds layer by layer.


2021 ◽  
Vol 7 (9) ◽  
pp. eabf0116
Author(s):  
Shiqi Huang ◽  
Shaoxian Li ◽  
Luis Francisco Villalobos ◽  
Mostapha Dakhchoune ◽  
Marina Micari ◽  
...  

Etching single-layer graphene to incorporate a high pore density with sub-angstrom precision in molecular differentiation is critical to realize the promising high-flux separation of similar-sized gas molecules, e.g., CO2 from N2. However, the rapid etching kinetics needed to achieve the high pore density are challenging to control with such precision. Here, we report a millisecond carbon gasification chemistry incorporating a high density (>10^12 cm^−2) of functional oxygen clusters that then evolve into CO2-sieving vacancy defects under controlled and predictable gasification conditions. A statistical distribution of nanopore lattice isomers is observed, in good agreement with the theoretical solution to the isomer cataloging problem. The gasification technique is scalable, and a centimeter-scale membrane is demonstrated. Last, the molecular cutoff could be adjusted by 0.1 Å via in situ expansion of the vacancy defects in an O2 atmosphere. Large CO2 and O2 permeances (>10,000 and 1000 GPU, respectively) are demonstrated, accompanied by attractive CO2/N2 and O2/N2 selectivities.

