Collection/aggregation algorithms in Lagrangian cloud microphysical models: Rigorous evaluation in box model simulations

Author(s):  
Simon Unterstrasser ◽  
Fabian Hoffmann ◽  
Marion Lerch

Abstract. Recently, several Lagrangian microphysical models have been developed which use a large number of (computational) particles to represent a cloud. In particular, the collision process leading to coalescence of cloud droplets or aggregation of ice crystals is implemented differently in the various models. Three existing implementations are reviewed and extended, and their performance is evaluated by comparison with well-established analytical and bin model solutions. In this first step of rigorous evaluation, box model simulations, with collection/aggregation being the only process considered, have been performed for the three well-known kernels of Golovin, Long and Hall. Besides numerical parameters, such as the time step and the number of simulation particles (SIPs) used, the details of how the initial SIP ensemble is created from a prescribed, analytically defined size distribution are crucial for the performance of the algorithms. Using a constant-weight technique, as done in previous studies, greatly underestimates the quality of the algorithms. Using better initialisation techniques considerably reduces the number of SIPs required to obtain realistic results. From the box model results, recommendations for the collection/aggregation implementation in higher-dimensional model setups are derived. Suitable algorithms are equally relevant to treating the warm-rain process and aggregation in cirrus.
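The initialisation issue the abstract highlights can be illustrated with a minimal Python sketch contrasting a constant-weight SIP ensemble with a bin-like ensemble that resolves the tails of the size distribution. The function names and the exponential droplet-mass distribution are illustrative assumptions, not the paper's actual algorithms.

```python
import numpy as np

def init_sips_constant_weight(n_sip, n_tot, lwc, rng):
    """Constant-weight init: every SIP carries the same weighting
    factor; droplet masses are random draws from an exponential
    distribution with mean mass lwc / n_tot (illustrative choice)."""
    mean_mass = lwc / n_tot
    masses = rng.exponential(mean_mass, n_sip)
    weights = np.full(n_sip, n_tot / n_sip)  # real droplets per SIP
    return masses, weights

def init_sips_bin_like(n_sip, n_tot, lwc, rng):
    """Bin-like init: SIPs are placed on logarithmically spaced mass
    bins so the distribution tails are resolved; each SIP's weight is
    the droplet number expected in its bin."""
    mean_mass = lwc / n_tot
    edges = np.geomspace(mean_mass * 1e-3, mean_mass * 1e2, n_sip + 1)
    masses = np.sqrt(edges[:-1] * edges[1:])   # geometric bin centres
    # exponential number density integrated over each bin
    cdf = 1.0 - np.exp(-edges / mean_mass)
    weights = n_tot * np.diff(cdf)
    return masses, weights

rng = np.random.default_rng(0)
# 100 SIPs representing 1e8 droplets with 1e-3 kg of liquid water
m_cw, w_cw = init_sips_constant_weight(100, 1e8, 1e-3, rng)
m_bl, w_bl = init_sips_bin_like(100, 1e8, 1e-3, rng)
```

Both ensembles conserve total droplet number (the binned one approximately, since the outermost tails are truncated); the difference lies in how well the large-droplet tail, which drives collection, is sampled.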

2017 ◽  
Vol 10 (4) ◽  
pp. 1521-1548 ◽  


2019 ◽  
Vol 29 (Supplement_4) ◽  
Author(s):  
M Perkiö ◽  
R Harrison ◽  
M Grivna ◽  
D Tao ◽  
C Evashwich

Abstract Education is a key to creating solidarity among the professionals who advance public health's interdisciplinary mission. Our assumption is that if all those who work in public health shared core knowledge and the skills for interdisciplinary interaction, collaboration across disciplines, venues, and countries would be facilitated. Evaluation of education is an essential element of pedagogy to ensure quality and consistency across boundaries, as articulated by the UNESCO education standards. Our study examined the evaluation studies done by programs that educate public health professionals. We searched the peer-reviewed literature published in English between 2000 and 2017 pertaining to the education of the public health workforce at a degree-granting level. The 2442 articles found covered ten health professions disciplines and had lead authors representing all continents. Only 86 articles focused on evaluation. The majority of the papers examined either a single course, a discipline-specific curriculum, or a teaching method. No consistent methodologies could be discerned. Methods ranged from sophisticated regression analyses and trends tracked over time to descriptions of focus groups and interviews of small samples. We found that evaluations were primarily discipline-specific, lacked rigorous methodology in many instances, and that relatively few examined competencies or career expectations. The public health workforce enjoys a diversity of disciplines but must be able to come together to share diverse knowledge and skills. Evaluation is critical to achieving a workforce that is well trained in the competencies pertinent to collaboration. This study highlights the pedagogical challenges that must be confronted going forward, starting with a commitment to shared core competencies and to consistent and rigorous evaluation of the education related to training public health professionals.
Key messages Rigorous evaluation is not sufficiently used to enhance the quality of public health education. More frequent use of rigorous evaluation in public health education would enhance the quality of the public health workforce and enable cross-disciplinary and international collaboration for solidarity.


2020 ◽  
Author(s):  
Andrew Gettelman ◽  
David John Gagne ◽  
Chih-Chieh Chen ◽  
Matthew Christensen ◽  
Zachary Lebo ◽  
...  

2016 ◽  
Vol 74 (5) ◽  
pp. 507-550 ◽  
Author(s):  
Carrie H. Colla ◽  
Alexander J. Mainor ◽  
Courtney Hargreaves ◽  
Thomas Sequist ◽  
Nancy Morden

The effectiveness of different types of interventions to reduce low-value care has been insufficiently summarized to allow for translation to practice. This article systematically reviews the literature on the effectiveness of interventions to reduce low-value care and the quality of those studies. We found that multicomponent interventions addressing both patient and clinician roles in overuse have the greatest potential to reduce low-value care. Clinical decision support and performance feedback are promising strategies with a solid evidence base, and provider education yields changes by itself and when paired with other strategies. Further research is needed on the effectiveness of pay-for-performance, insurer restrictions, and risk-sharing contracts to reduce use of low-value care. While the literature reveals important evidence on strategies used to reduce low-value care, meaningful gaps persist. More experimentation, paired with rigorous evaluation and publication, is needed.


2016 ◽  
Vol 16 (18) ◽  
pp. 11601-11615 ◽  
Author(s):  
Jane Coates ◽  
Kathleen A. Mar ◽  
Narendra Ojha ◽  
Tim M. Butler

Abstract. Surface ozone is a secondary air pollutant produced during the atmospheric photochemical degradation of emitted volatile organic compounds (VOCs) in the presence of sunlight and nitrogen oxides (NOx). Temperature directly influences ozone production through speeding up the rates of chemical reactions and increasing the emissions of VOCs, such as isoprene, from vegetation. In this study, we used an idealised box model with different chemical mechanisms (Master Chemical Mechanism, MCMv3.2; Common Representative Intermediates, CRIv2; Model for OZone and Related Chemical Tracers, MOZART-4; Regional Acid Deposition Model, RADM2; Carbon Bond Mechanism, CB05) to examine the non-linear relationship between ozone, NOx and temperature, and we compared this to previous observational studies. Under high-NOx conditions, an increase in ozone from 20 to 40 °C of up to 20 ppbv was due to faster reaction rates, while increased isoprene emissions added up to a further 11 ppbv of ozone. The largest inter-mechanism differences were obtained at high temperatures and high-NOx emissions. CB05 and RADM2 simulated more NOx-sensitive chemistry than MCMv3.2, CRIv2 and MOZART-4, which could lead to different mitigation strategies being proposed depending on the chemical mechanism. The increased oxidation rate of emitted VOC with temperature controlled the rate of Ox production; the net influence of peroxy nitrates increased net Ox production per molecule of emitted VOC oxidised. The rate of increase in ozone mixing ratios with temperature from our box model simulations was about half the rate of increase in ozone with temperature observed over central Europe or simulated by a regional chemistry transport model. 
Modifying the box model set-up to approximate stagnant meteorological conditions increased the rate of increase of ozone with temperature as the accumulation of oxidants enhanced ozone production through the increased production of peroxy radicals from the secondary degradation of emitted VOCs. The box model simulations approximating stagnant conditions and the maximal ozone production chemical regime reproduced the 2 ppbv increase in ozone per degree Celsius from the observational and regional model data over central Europe. The simulated ozone–temperature relationship was more sensitive to mixing than the choice of chemical mechanism. Our analysis suggests that reductions in NOx emissions would be required to offset the additional ozone production due to an increase in temperature in the future.
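The two temperature pathways the abstract describes, faster reaction rates and increased biogenic emissions, can be sketched in Python with an Arrhenius rate coefficient and an exponential (Guenther-style) emission scaling. All parameter values below are illustrative and are not taken from any of the named chemical mechanisms.

```python
import math

def arrhenius(a, ea_over_r, temp_k):
    """Arrhenius rate coefficient k(T) = A * exp(-(Ea/R) / T)."""
    return a * math.exp(-ea_over_r / temp_k)

def isoprene_emission_factor(temp_k, t_ref=303.0, beta=0.09):
    """Exponential temperature scaling of biogenic VOC emissions
    (Guenther-style shape; beta here is an illustrative value)."""
    return math.exp(beta * (temp_k - t_ref))

# speed-up of a generic reaction from 20 to 40 degrees C
k_20 = arrhenius(1.0e-12, 1500.0, 293.15)
k_40 = arrhenius(1.0e-12, 1500.0, 313.15)
emis_ratio = isoprene_emission_factor(313.15) / isoprene_emission_factor(293.15)
```

With these placeholder parameters the rate coefficient rises by a few tens of percent over the 20-degree span, while the emission factor grows several-fold, mirroring the two additive ozone contributions quantified in the abstract.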


2018 ◽  
Author(s):  
Anne Wiese ◽  
Joanna Staneva ◽  
Johannes Schultz-Stellenfleth ◽  
Arno Behrens ◽  
Luciana Fenoglio-Marc ◽  
...  

Abstract. In this study, the quality of wind and wave data provided by the new Sentinel-3A satellite is evaluated. We focus on coastal areas, where altimeter data are of lower quality than those for the open ocean. The satellite data of Sentinel-3A, Jason-2 and CryoSat-2 are assessed in a comparison with in situ measurements and spectral wave model (WAM) simulations. The sensitivity of the wave model to wind forcing is evaluated using data with different temporal and spatial resolution, such as ERA-Interim and ERA5 reanalyses, ECMWF operational analysis and short-range forecasts, German Weather Service (DWD) forecasts and regional atmospheric model simulations (coastDat). Numerical simulations show that both the wave model forced using the ERA5 reanalyses and that forced using the ECMWF operational analysis/forecast demonstrate the best capability over the whole study period, as well as during extreme events. To further estimate the variance of the significant wave height of ensemble members for different wind forcings, especially during extreme events, an empirical orthogonal function (EOF) analysis is performed. Intercomparisons between remote sensing and in situ observations demonstrate that the overall quality of the former is good over the North Sea and Baltic Sea throughout the study period, although the significant wave heights estimated based on satellite data tend to be greater than the in situ measurements by 7 cm to 26 cm. The quality of all satellite data near the coastal area decreases; however, within 10 km off the coast, Sentinel-3A performs better than the other two satellites. Analyses in which data from satellite tracks are separated in terms of onshore and offshore flights have been carried out. No substantial differences are found when comparing the statistics for onshore and offshore flights. Moreover, no substantial differences are found between satellite tracks under various metocean conditions.
Furthermore, the satellite data quality does not depend on the wind direction relative to the flight direction. Thus, the quality of the data obtained by the new Sentinel-3A satellite over coastal areas is improved compared to that of older satellites.
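The EOF analysis mentioned in the abstract can be sketched with a singular value decomposition of the time-anomaly field. The wave-height data below are synthetic and purely illustrative; only the decomposition itself is the point.

```python
import numpy as np

def eof_analysis(field):
    """EOF analysis of a (time, space) field: remove the time mean,
    then take the SVD.  Rows of `eofs` are spatial patterns, `pcs`
    the principal-component time series, `var_frac` the fraction of
    variance explained by each mode."""
    anomaly = field - field.mean(axis=0)
    u, s, vt = np.linalg.svd(anomaly, full_matrices=False)
    var_frac = s ** 2 / np.sum(s ** 2)
    pcs = u * s    # time series scaled by singular values
    eofs = vt      # spatial patterns
    return eofs, pcs, var_frac

# synthetic "significant wave height" field: one dominant standing
# pattern modulated in time, plus weak noise
rng = np.random.default_rng(1)
t = np.linspace(0.0, 4.0 * np.pi, 200)
pattern = np.sin(np.linspace(0.0, np.pi, 50))
field = np.outer(np.sin(t), pattern) + 0.05 * rng.standard_normal((200, 50))
eofs, pcs, var_frac = eof_analysis(field)
```

For this synthetic field the leading mode captures nearly all of the variance; in the ensemble application described above, the leading modes would instead summarize where the wind forcings disagree most.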


2006 ◽  
Vol 26 (1) ◽  
pp. 105-134 ◽  
Author(s):  
EMILY GRUNDY

This paper considers the processes and circumstances that create vulnerability among older people, specifically to a very poor quality of life or an untimely or degrading death. Models of ageing processes are used to define vulnerable older people as those whose reserve capacity falls below the threshold needed to cope successfully with the challenges they face. Compensatory supports may intervene to mitigate the effects of challenges and to rebuild reserve. The dimensions of reserve, challenges and compensation are discussed, with emphasis on demographic and other influences on the availability of family and social support. Policy initiatives to reduce vulnerability can focus on each part of the dynamic process that creates vulnerability, namely, ensuring that people reach later life with ‘reserve’, reducing the challenges they face in later life, and providing adequate compensatory supports. The promotion through the lifecourse of healthy lifestyles and the acquisition of coping skills, strong family and social ties, active interests, and savings and assets, will develop reserves and ensure that they are strong in later life. Some of the physical and psychological challenges that people may face as they age cannot be modified, but others can. Interventions to develop compensatory supports include access to good acute care and rehabilitation when needed, substitute professional social and psychological help in times of crisis, long-term help and income support. Our knowledge of which interventions are most effective is however limited by the paucity of rigorous evaluation studies.


2005 ◽  
Vol 44 (4) ◽  
pp. 445-466 ◽  
Author(s):  
Jerry M. Straka ◽  
Edward R. Mansell

Abstract A single-moment bulk microphysics scheme with multiple ice precipitation categories is described. It has 2 liquid hydrometeor categories (cloud droplets and rain) and 10 ice categories that are characterized by habit, size, and density—two ice crystal habits (column and plate), rimed cloud ice, snow (ice crystal aggregates), three categories of graupel with different densities and intercepts, frozen drops, small hail, and large hail. The concept of riming history is implemented for conversions among the graupel and frozen-drop categories. The multiple precipitation ice categories allow a range of particle densities and fall velocities for simulating a variety of convective storms with minimal parameter tuning. The scheme is applied to two cases—an idealized continental multicell storm that demonstrates the ice precipitation process, and a small Florida maritime storm in which the warm rain process is important.
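The category bookkeeping described in the abstract can be sketched as a simple data structure enumerating the 2 liquid and 10 ice categories. The density values below are illustrative placeholders, not the scheme's actual constants.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HydrometeorCategory:
    name: str
    phase: str            # "liquid" or "ice"
    density_kg_m3: float  # bulk particle density (placeholder values)

# the 12 categories named in the abstract; densities are only
# plausible placeholders to make the sketch concrete
CATEGORIES = [
    HydrometeorCategory("cloud droplets", "liquid", 1000.0),
    HydrometeorCategory("rain", "liquid", 1000.0),
    HydrometeorCategory("ice columns", "ice", 900.0),
    HydrometeorCategory("ice plates", "ice", 900.0),
    HydrometeorCategory("rimed cloud ice", "ice", 700.0),
    HydrometeorCategory("snow (aggregates)", "ice", 100.0),
    HydrometeorCategory("graupel (low density)", "ice", 300.0),
    HydrometeorCategory("graupel (medium density)", "ice", 500.0),
    HydrometeorCategory("graupel (high density)", "ice", 700.0),
    HydrometeorCategory("frozen drops", "ice", 900.0),
    HydrometeorCategory("small hail", "ice", 800.0),
    HydrometeorCategory("large hail", "ice", 900.0),
]
```

Keeping the categories in one table like this makes the scheme's design point visible: the three graupel entries differ only in assumed density (and, in the real scheme, intercept), which is what gives the range of fall velocities.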


2005 ◽  
Vol 5 (4) ◽  
pp. 879-885 ◽  
Author(s):  
S. Romakkaniemi ◽  
H. Kokkola ◽  
A. Laaksonen

Abstract. In this paper we present a parameterization of the nitric acid effect on cloud droplet formation. The new parameterization is intended to be used in large-scale models in order to obtain regional and global estimates of the effect of nitric acid on cloud drop concentrations and the radiative balance. The parameterization is based on numerical air parcel model simulations and can be applied to unimodal and bimodal lognormal aerosol particle size distributions in a large variety of conditions. In addition to the aerosol particle distribution and the gas-phase HNO3 concentration, the parameterization requires temperature, total pressure, updraft velocity, and the number concentration of cloud droplets formed at zero nitric acid concentration as input parameters. The parameterization is also suitable for describing the effect of hydrochloric acid on cloud drop concentrations, and in practice the HNO3 and HCl concentrations can be summed to yield the total effect. The comparison between the parameterization and the results from numerical air parcel model simulations shows good consistency.
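The calling interface the abstract describes can be sketched as a Python function taking the listed inputs and summing the HNO3 and HCl concentrations. The enhancement formula in the body is a deliberately simple placeholder, NOT the published fit.

```python
def droplet_number_with_hno3(n_d0, hno3_ppb, hcl_ppb, temp_k,
                             press_pa, updraft_m_s):
    """Illustrative wrapper for a nitric-acid cloud-activation
    parameterization.  n_d0 is the droplet number concentration
    formed at zero HNO3; per the abstract, the HNO3 and HCl
    gas-phase concentrations can simply be summed.  The saturating
    enhancement below is a placeholder functional form only."""
    acid_ppb = hno3_ppb + hcl_ppb   # summed total acid effect
    # placeholder: soluble acid gas raises droplet number, with the
    # effect saturating at high concentrations
    enhancement = 1.0 + 0.3 * acid_ppb / (1.0 + acid_ppb)
    return n_d0 * enhancement

# example call: 100 cm^-3 baseline (as m^-3), 1 ppb HNO3, no HCl
n_enhanced = droplet_number_with_hno3(100e6, 1.0, 0.0, 280.0, 9e4, 0.5)
```

The temperature, pressure, and updraft arguments are carried in the signature because the abstract lists them as required inputs, even though this placeholder body does not use them.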


2020 ◽  
Author(s):  
Bibi S Naz ◽  
Wendy Sharples ◽  
Klaus Goergen ◽  
Stefan Kollet

High-resolution, large-scale predictions of hydrologic states and fluxes are important for many regional-scale applications and for water resource management. However, model simulations remain uncertain because of uncertainties in the forcing data, structural errors arising from simplified representations of hydrological processes, and uncertain model parameters. To quantify this uncertainty, multi-model simulations were performed at 3 km resolution over the European continent using the Community Land Model (CLM3.5) and the ParFlow hydrologic model. While ParFlow uses a similar approach to CLM in simulating snow, vegetation and land-atmosphere exchange processes, it simulates three-dimensional variably saturated groundwater flow by solving the Richards equation, and overland flow with a two-dimensional kinematic wave approximation. CLM3.5 uses a simple groundwater model to account for groundwater recharge and discharge processes. Both models were driven with the COSMO-REA6 reanalysis dataset at 6 km resolution for the period 2000 to 2006 at an hourly time step, and both used the same datasets for the static input variables (such as topography, vegetation and soil properties). The performance of both models was analyzed through comparisons with independent observations, including satellite-derived and in situ soil moisture, evapotranspiration, river discharge, water table depth and total water storage datasets. Overall, both models capture the interannual variability in the hydrologic states and fluxes well; however, differences in performance between the models reveal the uncertainty associated with the representation of hydrological processes, such as groundwater flow and soil moisture and its control on latent and sensible heat fluxes at the surface.
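A model-versus-observation comparison of the kind described above typically reduces to a few standard skill scores. The sketch below computes mean bias, RMSE, and Pearson correlation; the discharge series are toy values, not data from the study.

```python
import numpy as np

def evaluation_metrics(simulated, observed):
    """Standard skill scores for comparing model output against
    observations: mean bias, root-mean-square error, and Pearson
    correlation coefficient."""
    sim = np.asarray(simulated, dtype=float)
    obs = np.asarray(observed, dtype=float)
    bias = np.mean(sim - obs)
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    corr = np.corrcoef(sim, obs)[0, 1]
    return bias, rmse, corr

# toy daily river-discharge series in m^3/s, purely illustrative
obs = np.array([120.0, 150.0, 300.0, 220.0, 180.0])
sim = np.array([110.0, 160.0, 280.0, 240.0, 170.0])
bias, rmse, corr = evaluation_metrics(sim, obs)
```

Reporting bias and RMSE alongside correlation separates systematic offsets (e.g. a model that is uniformly too wet) from errors in timing and variability, which is why evaluations like the one above quote several scores per variable.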

