Planned Burn-Piedmont. A local operational numerical meteorological model for tracking smoke on the ground at night: model development and sensitivity tests

2005 ◽  
Vol 14 (1) ◽  
pp. 85 ◽  
Author(s):  
Gary L. Achtemeier

Smoke from both prescribed fires and wildfires can, under certain meteorological conditions, become entrapped within shallow layers of air near the ground at night and be carried to unexpected destinations as a combination of weather systems pushes air through the interlocking ridge–valley terrain typical of the Piedmont of the Southern United States. Entrapped smoke confined within valleys is often slow to disperse. When moist conditions are present, hygroscopic particles within smoke may initiate or augment fog formation. With or without fog, smoke transported across roadways can create visibility hazards. Planned Burn (PB)-Piedmont is a fine-scale, time-dependent smoke-tracking model designed to run on a personal computer as an easy-to-use aid for land managers. PB-Piedmont gives predictions of smoke movement at high resolution in space and time within shallow layers at the ground over terrain typical of the Piedmont. PB-Piedmont applies only to the weather conditions under which smoke entrapment is most likely to occur: at night with clear skies and light winds. This paper presents the model description and gives examples of model performance in comparison with observations of entrapped smoke collected during two nights of a field project. The results show that PB-Piedmont is capable of describing the movement of whole smoke plumes within the constraints for which the model was designed.

2021 ◽  
Vol 9 (5) ◽  
pp. 467
Author(s):  
Mostafa Farrag ◽  
Gerald Corzo Perez ◽  
Dimitri Solomatine

Many grid-based spatial hydrological models suffer from the complexity of setting up a coherent spatial structure for calibrating such complex, highly parameterized systems. Essential aspects of model-building, such as the spatial resolution, the limitations of the routing equation, and the calibration of spatial parameters and their influence on modeling results, are decisions that are often made without adequate analysis. In this research, the level of grid discretization, the integration of processes, and the routing concepts are analyzed experimentally. The HBV-96 model is set up for each cell, and the cells are then integrated into an interlinked modeling system (Hapi). The Jiboa River Basin in El Salvador is used as a case study. The first concept tested is the temporal response of the model structure, which is closely linked to the runoff dynamics. By changing the description of the runoff-generation model, we explore the responses to events. Two routing models are considered: Muskingum, which routes the runoff from each cell along the river network, and Maxbas, which routes the runoff directly to the outlet. The second concept is the spatial representation, where the model is built and tested at different spatial resolutions (500 m, 1 km, 2 km, and 4 km). The results show that the sensitivity to spatial resolution is closely linked to the routing method; routing sensitivity influenced model performance more than spatial discretization did, and allowing coarser discretization makes the model simpler and computationally faster. A slight performance improvement is gained by using different parameter values for each cell, and the 2 km cell size yields the lowest model error. The proposed hydrological modeling codes have been published as open-source software.
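The Muskingum scheme referred to above routes each cell's runoff downstream as a linear, storage-weighted combination of the current inflow, the previous inflow, and the previous outflow. A minimal sketch of the classical recursion (the values of K, X, and the time step are illustrative, not the calibrated Hapi parameters):

```python
def muskingum_route(inflow, K=12.0, X=0.2, dt=1.0):
    """Route a hydrograph through one reach with the Muskingum method.

    K  -- storage time constant (same units as dt), illustrative value
    X  -- inflow/outflow weighting factor (0 <= X <= 0.5), illustrative
    dt -- routing time step
    """
    denom = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / denom
    c1 = (dt + 2.0 * K * X) / denom
    c2 = (2.0 * K * (1.0 - X) - dt) / denom  # c0 + c1 + c2 == 1 (mass balance)
    outflow = [inflow[0]]  # assume an initial steady state: outflow = inflow
    for t in range(1, len(inflow)):
        outflow.append(c0 * inflow[t] + c1 * inflow[t - 1] + c2 * outflow[t - 1])
    return outflow
```

Routing each cell's runoff through its downstream neighbours along the river network amounts to applying this recursion reach by reach; Maxbas, by contrast, lumps the travel-time delay into a single transfer function at the outlet.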


2010 ◽  
Vol 61 (4) ◽  
pp. 825-839 ◽  
Author(s):  
H. Hauduc ◽  
L. Rieger ◽  
I. Takács ◽  
A. Héduit ◽  
P. A. Vanrolleghem ◽  
...  

The quality of simulation results can be significantly affected by errors in the published model (typing errors, inconsistencies, gaps, or conceptual errors) and/or in the underlying numerical model description. Seven of the most commonly used activated sludge models have been investigated to identify the typing errors, inconsistencies, and gaps in the model publications: ASM1; ASM2d; ASM3; ASM3 + Bio-P; ASM2d + TUD; New General; UCTPHO+. A systematic approach to verifying models by tracking typing errors and inconsistencies in model development and software implementation is proposed. Stoichiometry and kinetic rate expressions are then checked for each model, and the errors found are reported in detail. An attached spreadsheet (see http://www.iwaponline.com/wst/06104/0898.pdf) provides corrected matrices with the calculations of all stoichiometric coefficients for the discussed biokinetic models and gives an example of proper continuity checks.
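A continuity check of the kind the spreadsheet demonstrates verifies that, for each process row of the stoichiometric (Gujer) matrix, the coefficients weighted by the components' composition (e.g. their COD content) sum to zero. A minimal sketch with a hypothetical two-process matrix, not taken from any of the seven published models:

```python
import numpy as np

Y = 0.67  # illustrative heterotrophic yield (g COD biomass per g COD substrate)

# Hypothetical Gujer matrix: rows = processes, columns = components
# S_S (substrate), X_B (biomass), X_I (inerts), S_O (dissolved oxygen).
stoichiometry = np.array([
    [-1.0 / Y,  1.0, 0.0, -(1.0 - Y) / Y],  # aerobic growth on substrate
    [ 0.0,     -1.0, 1.0,  0.0],            # decay of biomass to inerts
])

# COD content of each component; dissolved oxygen counts as negative COD.
cod_composition = np.array([1.0, 1.0, 1.0, -1.0])

# Continuity check: every process row must conserve COD.
residuals = stoichiometry @ cod_composition
assert np.allclose(residuals, 0.0), f"COD continuity violated: {residuals}"
```

A typing error in any coefficient of a published matrix shows up immediately as a nonzero residual in the corresponding process row.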


2019 ◽  
Vol 31 (2) ◽  
Author(s):  
Anika Nowshin Mowrin ◽  
Md. Hadiuzzaman ◽  
Saurav Barua ◽  
Md. Mizanur Rahman

The commuter train is a viable alternative to road transport for easing traffic congestion, but it requires appropriate planning by the authorities concerned. This research assesses passengers' perceptions of the commuter train service running in areas near Dhaka city. An Adaptive Neuro-Fuzzy Inference System (ANFIS) model was developed to evaluate the service quality (SQ) of the commuter train. A field survey was conducted among 802 respondents who were regular users of the commuter train, and 12 attributes were selected for model development. The ANFIS model was trained and tested on 80% and 20% of the total sample, respectively. Model performance was then evaluated by (i) confusion matrix and (ii) root mean square error (RMSE), and the attributes were ranked by their relative importance. The proposed ANFIS model achieved 61.50% accuracy in training and 47.80% accuracy in testing. The results show that 'Bogie condition', 'Cleanliness', 'Female harassment', 'Behavior of staff', and 'Toilet facility' are the most significant attributes, indicating that measures should be taken immediately to address these attributes and so improve the SQ of the commuter train.
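The two evaluation measures named above are standard: overall accuracy is the trace of the confusion matrix divided by the number of samples, and RMSE is the root of the mean squared deviation between observed and predicted scores. A short sketch using hypothetical ratings on an ordinal scale (not the survey data):

```python
import numpy as np

def confusion_accuracy(actual, predicted, labels):
    """Build a confusion matrix and return (overall accuracy, matrix)."""
    index = {label: i for i, label in enumerate(labels)}
    cm = np.zeros((len(labels), len(labels)), dtype=int)
    for a, p in zip(actual, predicted):
        cm[index[a], index[p]] += 1  # rows: actual class, columns: predicted
    return np.trace(cm) / cm.sum(), cm

def rmse(actual, predicted):
    """Root mean square error between observed and predicted scores."""
    a = np.asarray(actual, dtype=float)
    p = np.asarray(predicted, dtype=float)
    return float(np.sqrt(np.mean((a - p) ** 2)))
```

For example, hypothetical SQ ratings `[1, 2, 2, 3]` against predictions `[1, 2, 3, 3]` give an accuracy of 0.75 and an RMSE of 0.5.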


Risks ◽  
2021 ◽  
Vol 9 (11) ◽  
pp. 204
Author(s):  
Chamay Kruger ◽  
Willem Daniel Schutte ◽  
Tanja Verster

This paper proposes a methodology that utilises model performance as a metric to assess the representativeness of external or pooled data when it is used by banks in regulatory model development and calibration. There is currently no formal methodology for assessing representativeness. The paper reviews existing regulatory literature on the requirements for assessing representativeness and emphasises that both qualitative and quantitative aspects need to be considered. We present a novel methodology, apply it to two case studies, and compare it with the Multivariate Prediction Accuracy Index. The first case study investigates whether a pooled data source from Global Credit Data (GCD) is representative when internal data are enriched with pooled data in the development of a regulatory loss given default (LGD) model. The second case study differs from the first by illustrating which other countries in the pooled data set could be representative when enriching internal data during the development of an LGD model. Using these case studies as examples, our proposed methodology provides users with a generalised framework for identifying subsets of the external data that are representative of their country's or bank's data, making the results general and universally applicable.
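The paper's exact metric is not reproduced here, but the core idea, judging an external subset representative when a model's performance on it is close to its performance on internal data, can be sketched as a simple decision rule (the tolerance and the choice of performance measure are illustrative assumptions):

```python
def representative_subsets(internal_perf, external_perf, tolerance=0.05):
    """Flag external subsets (e.g. countries) whose model performance lies
    within `tolerance` of the performance measured on internal data.

    `internal_perf` and the values of `external_perf` can be any
    higher-is-better performance measure (e.g. AUC or Gini); the
    tolerance of 0.05 is an illustrative assumption, not the paper's.
    """
    return {name: abs(perf - internal_perf) <= tolerance
            for name, perf in external_perf.items()}
```

For instance, `representative_subsets(0.70, {"A": 0.68, "B": 0.55})` would flag hypothetical country A as representative and country B as not.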


2020 ◽  
Author(s):  
Bernd Schalge ◽  
Gabriele Baroni ◽  
Barbara Haese ◽  
Daniel Erdal ◽  
Gernot Geppert ◽  
...  

Abstract. Coupled numerical models, which simulate water and energy fluxes in the subsurface-land surface-atmosphere system in a physically consistent way, are a prerequisite for analysing and better understanding heat and matter exchange fluxes at compartmental boundaries and the interdependencies of states across these boundaries. Complete state evolutions generated by such models may be regarded as a proxy of the real world, provided they are run at sufficiently high resolution and incorporate the most important processes. Such a virtual reality can be used to test hypotheses on the functioning of the coupled terrestrial system. Coupled simulation systems, however, face severe problems caused by the vastly different scales of the processes acting in and between the compartments of the terrestrial system, which also hinders comprehensive tests of their realism. We used the Terrestrial Systems Modeling Platform TerrSysMP, which couples the meteorological model COSMO, the land-surface model CLM, and the subsurface model ParFlow, to generate a virtual catchment for a regional terrestrial system mimicking the Neckar catchment in southwest Germany. Simulations for this catchment cover the period 2007–2015, at a spatial resolution of 400 m for the land surface and subsurface and 1.1 km for the atmosphere. Alongside a discussion of modelling challenges, the model performance is evaluated against real observations covering several variables of the water cycle. We find that the simulated (virtual) catchment behaves in many respects quite similarly to the real Neckar catchment, e.g. concerning atmospheric boundary-layer height, precipitation, and runoff. However, discrepancies also become apparent, both in the model's ability to correctly simulate some processes that still need improvement, such as overland flow, and in the realism of some observation operators, such as the satellite-based soil moisture sensors.
The whole raw dataset is available for interested users. The dataset described here is available via the CERA database (Schalge et al., 2020): https://doi.org/10.26050/WDCC/Neckar_VCS_v1.


2021 ◽  
Author(s):  
Thomas Weninger ◽  
Simon Scheper ◽  
Nathan King ◽  
Karl Gartner ◽  
Barbara Kitzler ◽  
...  

Wind erosion of arable soil is considered a risk factor for Austrian fields, but direct measurements of soil loss have not been available until now. Despite this uncertainty, vegetated windbreaks have been established to minimize adverse wind impacts on arable land. The study addresses these questions: i) How relevant is wind erosion as a factor in soil degradation? ii) How important is the protective effect of vegetated windbreaks? iii) Are systematic patterns of spatial and temporal variability in wind erosion rates detectable in response to weather conditions?

Two experimental fields adjacent to windbreaks were equipped with sediment traps, soil moisture sensors, and meteorological equipment to capture microclimatic patterns. Sediment traps were arranged at high spatial resolution, from next to the windbreak out to a distance of ten times the windbreak height. Beginning in January 2020, the amount of trapped sediment was analyzed every three weeks. The highest wind erosion rates on bare soil were observed in June and July. For unprotected fields with bare soil, upscaled annual erosion rates were as high as 0.8 tons per hectare, and the amount of trapped sediment increased linearly with distance from the windbreak. Soil water content near the surface (5 cm depth) was three percent higher at a distance of two times the windbreak height than at a distance of six times the height. For the same respective distances from the windbreak, we observed 29 days of soil water content below the wilting point compared with 60 days.

The preliminary outcomes confirmed the expected effects of windbreaks on soil erosion and microclimate in agricultural fields. Results from further vegetation periods will be used in an upscaling approach to obtain information for the whole basin, by combining the measurements with a soil wind-erosion model that has so far been used for regional modelling of wind-erosion susceptibility.


2020 ◽  
Author(s):  
Sandip Som ◽  
Saibal Ghosh ◽  
Soumitra Dasgupta ◽  
Thrideep Kumar ◽  
J. N. Hindayar ◽  
...  

Abstract Modeling landslide susceptibility is one of the important aspects of land-use planning and risk management. Several modeling methods are available, based either on highly specialized knowledge of causative attributes or on good landslide inventory data to use as training and testing attributes in model development. Understandably, these two prerequisites are rarely available to local land regulators. This paper presents a new model methodology that requires minimal knowledge of causative attributes and does not depend on a landslide inventory. As landslides occur due to the combined effect of causative attributes, this model utilizes the communality (common variance) of the attributes, extracted by exploratory factor analysis and used to calculate a landslide susceptibility index. The model can capture the inter-relationships of the different geo-environmental attributes responsible for landslides, and it identifies and prioritizes attributes by their contribution to model performance in order to delineate non-performing attributes. Finally, the model's performance is compared with the well-established AHP method (knowledge-driven) and FRM method (data-driven) using cut-off-independent ROC curves, along with cost-effectiveness. The model performs almost at par with the established models while requiring minimal modeling expertise. The findings and results of the present work will be helpful to town planners and engineers at a regional scale for generalized planning and assessment.
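Communality, the share of an attribute's variance explained by the retained common factors, is the sum of its squared factor loadings. A minimal sketch of the extraction step (principal-component style extraction on the correlation matrix; the number of factors is an illustrative choice, not the paper's):

```python
import numpy as np

def communalities(X, n_factors=2):
    """Communality of each attribute: the variance it shares with the
    retained common factors (sum of squared loadings per attribute).

    Factors are extracted from the correlation matrix by an
    eigendecomposition; `n_factors` is an illustrative choice.
    X is an (observations x attributes) array.
    """
    R = np.corrcoef(X, rowvar=False)          # attribute correlation matrix
    eigvals, eigvecs = np.linalg.eigh(R)      # ascending eigenvalues
    top = np.argsort(eigvals)[::-1][:n_factors]
    loadings = eigvecs[:, top] * np.sqrt(eigvals[top])
    return np.sum(loadings ** 2, axis=1)      # one communality per attribute
```

A susceptibility index could then be formed, for instance, as a communality-weighted sum of the normalized attribute layers; that weighting scheme is an assumption here, since the paper's exact index calculation is not reproduced.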


2015 ◽  
Vol 8 (11) ◽  
pp. 3579-3591 ◽  
Author(s):  
T. Zhang ◽  
L. Li ◽  
Y. Lin ◽  
W. Xue ◽  
F. Xie ◽  
...  

Abstract. Physical parameterizations in general circulation models (GCMs), which contain many uncertain parameters, greatly affect model performance and model climate sensitivity. Traditional manual and empirical tuning of these parameters is time-consuming and ineffective. In this study, a "three-step" methodology is proposed to automatically and effectively obtain the optimum combination of some key parameters in cloud and convective parameterizations according to comprehensive objective evaluation metrics. Unlike traditional optimization methods, two extra steps, one determining the model's sensitivity to the parameters and the other choosing the optimum initial values for the sensitive parameters, are introduced before the downhill simplex method. This reduces the number of parameters to be tuned and accelerates the convergence of the downhill simplex method. Atmospheric GCM simulations show that the optimum parameter combination determined with this method improves the model's overall performance by 9 %. The proposed methodology and software framework can be easily applied to other GCMs to speed up the model development process, especially the unavoidable comprehensive parameter tuning during the model development stage.
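The first two steps of the methodology can be sketched with a toy objective standing in for the comprehensive evaluation metrics (the objective, parameter layout, and threshold below are all illustrative; step 3 would hand the chosen start to a downhill simplex, e.g. scipy.optimize.minimize with method="Nelder-Mead"):

```python
def objective(params):
    """Stand-in for the comprehensive evaluation metrics (lower = better);
    a real application would run the GCM and aggregate many scored fields."""
    x, y, z = params
    return (x - 1.0) ** 2 + (y + 0.5) ** 2 + 0.001 * z ** 2

def screen_sensitive(defaults, deltas, threshold=0.01):
    """Step 1: perturb one parameter at a time and keep only those whose
    effect on the objective exceeds `threshold` (an illustrative cutoff)."""
    base = objective(defaults)
    sensitive = []
    for i, delta in enumerate(deltas):
        trial = list(defaults)
        trial[i] += delta
        if abs(objective(trial) - base) > threshold:
            sensitive.append(i)
    return sensitive

def best_initial(candidates):
    """Step 2: pick the candidate start with the best objective value;
    step 3 would refine it with the downhill simplex method."""
    return min(candidates, key=objective)
```

With defaults `[0, 0, 0]` and perturbations of 0.5, only the first two parameters are flagged as sensitive, so the subsequent simplex search runs in a reduced parameter space, which is the source of the speed-up the paper describes.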


2011 ◽  
Vol 11 (11) ◽  
pp. 2965-2979 ◽  
Author(s):  
L. Bertotti ◽  
P. Canestrelli ◽  
L. Cavaleri ◽  
F. Pastore ◽  
L. Zampato

Abstract. We describe the Henetus wave forecast system for the Adriatic Sea. Operational since 1996, the system is continuously upgraded, especially through the correction of the input ECMWF wind fields. As these fields progressively improve in quality with the increasing resolution of the meteorological model, the correction needs to be updated correspondingly. This has ensured a practically constant quality of the Henetus results in the Adriatic Sea since 1996. After suitable and extended validation of the quality of the results at different forecast ranges, the operational range has recently been extended to five days. The Henetus results are also used to improve the tidal forecast on the Venetian coasts and in the Venice lagoon, particularly during the most severe events. Extensive statistics on model performance, both in analysis and in forecast mode, are provided by comparing the model results with both satellite and buoy data.


2018 ◽  
Vol 184 ◽  
pp. 01013
Author(s):  
Peter Möller

The macroscopic-microscopic model based on the folded-Yukawa single-particle potential and a "finite-range" macroscopic model is probably the approach that has provided the most reliable predictions of a large number of nuclear-structure properties for all nuclei between the proton and neutron drip lines. I will describe some basic features of the model and the development philosophy that may be the reason for its success. Examples of quantities modeled within the same framework are nuclear masses, ground-state level structure (including spins), ground-state shapes, fission barriers, heavy-ion fusion barriers, sub-barrier fusion cross sections, β-decay half-lives and delayed-neutron emission probabilities, shape coexistence, and α-decay Qα energies, to name a few. I will show how well it predicted various properties that were measured after the results were published. Rather than giving an incomplete model description here, I will give a timeline of the model development and provide references to typical applications, references that are sufficiently complete that several individuals have written computer codes based on them, codes whose results agree excellently with ours.

