Multiobjective Calibration Framework for Pedestrian Simulation Models: A Study on the Effect of Movement Base Cases, Metrics, and Density Levels

2019 ◽  
Vol 2019 ◽  
pp. 1-18 ◽  
Author(s):  
Martijn Sparnaaij ◽  
Dorine C. Duives ◽  
Victor L. Knoop ◽  
Serge P. Hoogendoorn

Ideally, a multitude of steps has to be taken before a commercial implementation of a pedestrian model is used in practice. Calibration, the main goal of which is to increase the accuracy of the predictions by determining the set of parameter values that best replicates reality, has an important role in this process. Yet, until recently, calibration has received relatively little attention within the field of pedestrian modelling. Most studies focus on only one specific movement base case and/or use a single metric. It is questionable how generally applicable a pedestrian simulation model is when it has been calibrated using a limited set of movement base cases and a single metric. The objective of this research is twofold, namely, (1) to determine the effect of the choice of movement base cases, metrics, and density levels on the calibration results and (2) to develop a multiple-objective calibration approach to determine the aforementioned effects. In this paper, a multiple-objective calibration scheme is presented for pedestrian simulation models, in which multiple normalized metrics (i.e., flow, spatial distribution, effort, and travel time) are combined by means of a weighted sum method that accounts for the stochastic nature of the model. Based on the analysis of the calibration results, it can be concluded that (1) it is necessary to use multiple movement base cases when calibrating a model to capture all relevant behaviours, (2) the level of density influences the calibration results, and (3) the choice of metric or combination of metrics severely influences the results.
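The weighted-sum combination of normalized metrics described in the abstract above can be sketched as follows. This is a minimal illustration, not the authors' calibration setup: the metric ordering, equal weights, and error values are assumed for the example, and averaging over replications stands in for the paper's treatment of model stochasticity.

```python
import numpy as np

def weighted_sum_objective(metric_errors, weights):
    """Combine normalized metric errors (one row per stochastic
    replication, one column per metric) into a single scalar objective."""
    errors = np.asarray(metric_errors, dtype=float)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()    # normalize weights to sum to 1
    per_replication = errors @ weights   # weighted sum for each replication
    return per_replication.mean()        # average over stochastic replications

# Example: 3 replications x 4 metrics (flow, spatial distribution,
# effort, travel time), with equal weights.
errs = [[0.2, 0.1, 0.3, 0.4],
        [0.3, 0.2, 0.2, 0.5],
        [0.1, 0.3, 0.4, 0.3]]
score = weighted_sum_objective(errs, [1, 1, 1, 1])
```

A calibration run would minimize `score` over the model parameters, re-simulating the replications at each candidate parameter set.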

2020 ◽  
Vol 5 ◽  
Author(s):  
Fredrik Johansson

One of the main strengths of microscopic pedestrian simulation models is the ability to explicitly represent the heterogeneity of the pedestrian population. Most pedestrian populations are heterogeneous with respect to the desired speed, and the outputs of microscopic models are naturally sensitive to the desired speed; it has a direct effect on the flow and travel time, thus strongly affecting results that are of interest when applying pedestrian simulation models in practice. An inaccurate desired speed distribution will in most cases lead to inaccurate simulation results. In this paper we propose a method to estimate the desired speed distribution by treating the desired speeds as model parameters to be adjusted in the calibration together with other model parameters. This leads to an optimization problem that is computationally costly to solve for large data sets. We propose a heuristic method to solve this optimization problem by decomposing the original problem into simpler parts that are solved separately. We demonstrate the method on trajectory data from Stockholm central station and analyze the results to conclude that the method is able to produce a plausible desired speed distribution under slightly congested conditions.
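The decomposition idea, solving simple per-pedestrian sub-problems and then aggregating, can be illustrated with a much cruder heuristic than the paper's: take a high percentile of each pedestrian's observed speeds as a free-flow proxy, then fit a distribution to those estimates. The percentile choice and the normal fit are assumptions for this sketch, not the authors' method.

```python
import numpy as np

def estimate_desired_speeds(trajectories, percentile=90):
    """Per-pedestrian sub-problem: take a high percentile of each
    pedestrian's observed walking speeds as a proxy for the desired
    (free-flow) speed, which congestion would otherwise mask."""
    return np.array([np.percentile(speeds, percentile)
                     for speeds in trajectories])

def fit_speed_distribution(desired_speeds):
    """Aggregate step: fit a normal distribution to the per-pedestrian
    estimates and return its mean and standard deviation."""
    return float(np.mean(desired_speeds)), float(np.std(desired_speeds))

# Example: observed speed samples (m/s) for three pedestrians.
obs = [[1.1, 1.3, 1.4, 1.2], [0.9, 1.0, 1.1], [1.5, 1.6, 1.4]]
mu, sigma = fit_speed_distribution(estimate_desired_speeds(obs))
```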


Author(s):  
Victor M. Carrillo ◽  
German Almanza

There exist two general approaches to solving multiple-objective problems. The first comprises the classical mathematical methods: the weighted sum method, goal programming, and utility function methods belong to this approach. The output of these mathematical methods is a single optimal solution. The second approach comprises heuristic methods, such as multiple-objective evolutionary algorithms, which offer the decision maker a set of optimal solutions usually called non-dominated or Pareto-optimal solutions. This set is usually very large, and the decision maker faces the problem of reducing it to a manageable number of solutions to analyze. In this paper the second approach is used to reduce the Pareto front using two weight generators for the non-numerical ranking preferences method, and their performance is compared.
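The non-dominated set at the heart of the second approach can be computed with a straightforward pairwise check. This sketch shows only the standard Pareto-dominance filter for minimization problems; the paper's weight generators for reducing the front are not reproduced here.

```python
def pareto_front(solutions):
    """Return the non-dominated subset of a list of objective vectors,
    assuming minimization in every objective. A solution is dominated
    if another is no worse in all objectives and strictly better in one."""
    front = []
    for i, a in enumerate(solutions):
        dominated = any(
            all(b[k] <= a[k] for k in range(len(a))) and
            any(b[k] < a[k] for k in range(len(a)))
            for j, b in enumerate(solutions) if j != i)
        if not dominated:
            front.append(a)
    return front

# Example: bi-objective minimization; (1, 3) and (2, 2) are both
# dominated by (1, 2).
sols = [(1, 3), (2, 2), (3, 1), (1, 2)]
print(pareto_front(sols))   # → [(3, 1), (1, 2)]
```

The quadratic cost of this filter is exactly why large Pareto fronts must be pruned to a manageable size before a decision maker can inspect them.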


1986 ◽  
Vol 51 (5) ◽  
pp. 1001-1015 ◽  
Author(s):  
Ivan Fořt ◽  
Vladimír Rogalewicz ◽  
Miroslav Richter

The study describes simulation of the motion of bubbles of gas dispersed by a mechanical impeller in a turbulent low-viscosity liquid flow. The model employs the Monte Carlo method and is based both on the knowledge of the mean velocity field of the mixed liquid (mean motion) and of the spatial distribution of turbulence intensity (fluctuating motion) in the investigated system - a cylindrical tank with radial baffles at the wall and a standard (Rushton) turbine impeller on the vessel axis. The motion of the liquid is then superimposed with that of the bubbles in a still environment (ascending motion). The simulation computes the spatial distribution of the gas hold-up (volumetric concentration) in the agitated charge as well as the total gas hold-up of the system, depending on the impeller size and its frequency of revolution, the volumetric gas flow rate, and the physical properties of the gas and liquid. As model parameters, both the liquid velocity field and the characteristics of the normal distribution of gas bubbles are considered, assuming that the bubbles in the system do not coalesce.


2021 ◽  
Vol 143 (9) ◽  
Author(s):  
Yi-Ping Chen ◽  
Kuei-Yuan Chan

Abstract Simulation models play a crucial role in efficient product development cycles; therefore, many studies aim to improve the confidence in a model during the validation stage. In this research, we propose a dynamic model validation approach that provides accurate parameter settings for minimal output errors between simulation models and real-world experiments. Optimal excitation operations are developed to maximize the effects of specific model parameters while minimizing their interactions. To manage the excessive costs associated with simulating complex systems, we propose a procedure with three main features: (1) optimal excitation based on global sensitivity analysis (GSA) via metamodel techniques, (2) parameter estimation with a polynomial chaos-based Kalman filter, and (3) validation of the updated model via hypothesis testing. An illustrative mathematical model is used to demonstrate the detailed process of the proposed method. We also apply the method to a vehicle dynamics case with a composite maneuver for exciting unknown model parameters such as inertia and tire model coefficients; the unknown model parameters were successfully estimated within a 95% credible interval. The contributions of this research are also underscored through multiple cases.
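The global sensitivity analysis step named in the abstract above is commonly performed with variance-based (Sobol) indices. The sketch below uses a standard Saltelli-style pick-freeze estimator on a toy two-input model; the model, input distributions, and sample size are assumptions for illustration and are unrelated to the paper's vehicle dynamics case or its metamodel techniques.

```python
import numpy as np

def first_order_sobol(model, dim, n=20000, seed=0):
    """Saltelli pick-freeze estimator of first-order Sobol sensitivity
    indices for a model with independent uniform(0, 1) inputs."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, dim))
    B = rng.random((n, dim))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    indices = []
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # vary only input i between fA and fABi
        fABi = model(ABi)
        # S_i = Var(E[Y | X_i]) / Var(Y), estimated by pick-freeze
        indices.append(np.mean(fB * (fABi - fA)) / var)
    return indices

# Toy model dominated by its first input: Y = X1 + 0.1 * X2.
model = lambda X: X[:, 0] + 0.1 * X[:, 1]
s1, s2 = first_order_sobol(model, dim=2)
```

Parameters with large indices are the ones worth exciting during validation, since they account for most of the output variance.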


2017 ◽  
Vol 18 (7) ◽  
pp. 2029-2042
Author(s):  
Tony E. Wong ◽  
William Kleiber ◽  
David C. Noone

Abstract Land surface models are notorious for containing many parameters that control the exchange of heat and moisture between land and atmosphere. Properly modeling the partitioning of total evapotranspiration (ET) between transpiration and evaporation is critical for accurate hydrological modeling, but depends heavily on the treatment of turbulence within and above canopies. Previous work has constrained estimates of evapotranspiration and its partitioning using statistical approaches that calibrate land surface model parameters by assimilating in situ measurements. These studies, however, are silent on the impacts of the accounting of uncertainty within the statistical calibration framework. The present study calibrates the aerodynamic, leaf boundary layer, and stomatal resistance parameters, which partially control canopy turbulent exchange and thus the evapotranspiration flux partitioning. Using an adaptive Metropolis–Hastings algorithm to construct a Markov chain of draws from the joint posterior distribution of these resistance parameters, an ensemble of model realizations is generated, in which latent and sensible heat fluxes and top soil layer temperature are optimized. A set of five calibration experiments demonstrate that model performance is sensitive to the accounting of various sources of uncertainty in the field observations and model output and that it is critical to account for model structural uncertainty. After calibration, the modeled fluxes and top soil layer temperature are largely free from bias, and this calibration approach successfully informs and characterizes uncertainty in these parameters, which is essential for model improvement and development. 
The key points of this paper are 1) a Markov chain Monte Carlo calibration approach successfully improves modeled turbulent fluxes; 2) ET partitioning estimates hinge on the representation of uncertainties in the model and data; and 3) despite these inherent uncertainties, constrained posterior estimates of ET partitioning emerge.
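The core of the Markov chain Monte Carlo machinery used above is the Metropolis acceptance rule. This is a generic 1-D random-walk Metropolis sketch, not the paper's adaptive Metropolis-Hastings algorithm or its land surface posterior; the step size, target density, and chain length are illustrative assumptions.

```python
import math
import random

def metropolis_hastings(log_posterior, theta0, n_steps, step_size=0.1):
    """Random-walk Metropolis sampler: draws from a target distribution
    known only up to a normalizing constant via its log-density."""
    chain, theta = [theta0], theta0
    lp = log_posterior(theta)
    for _ in range(n_steps):
        proposal = theta + random.gauss(0.0, step_size)
        lp_new = log_posterior(proposal)
        # Accept with probability min(1, posterior ratio).
        if math.log(random.random()) < lp_new - lp:
            theta, lp = proposal, lp_new
        chain.append(theta)   # on rejection, the old state is repeated
    return chain

# Example: sample a 1-D standard normal posterior (log-density -t^2 / 2).
random.seed(0)
samples = metropolis_hastings(lambda t: -0.5 * t * t, 0.0, 5000, 1.0)
```

In a calibration setting, `log_posterior` would wrap a run of the land surface model: log-likelihood of the observed fluxes plus the log-prior on the resistance parameters.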


2018 ◽  
Vol 612 ◽  
pp. A70 ◽  
Author(s):  
J. Olivares ◽  
E. Moraux ◽  
L. M. Sarro ◽  
H. Bouy ◽  
A. Berihuete ◽  
...  

Context. Membership analyses of the DANCe and Tycho + DANCe data sets provide the largest and least contaminated sample of Pleiades candidate members to date. Aims. We aim at reassessing the different proposals for the number surface density of the Pleiades in the light of the new and most complete list of candidate members, and inferring the parameters of the most adequate model. Methods. We compute the Bayesian evidence and Bayes factors for variations of the classical radial models. These include elliptical symmetry and luminosity segregation. As a by-product of the model comparison, we obtain posterior distributions for each set of model parameters. Results. We find that the model comparison results depend on the spatial extent of the region used for the analysis. For a circle of 11.5 parsecs around the cluster centre (the most homogeneous and complete region), we find no compelling reason to abandon King’s model, although the Generalised King model introduced here has slightly better fitting properties. Furthermore, we find strong evidence against radially symmetric models when compared to the elliptic extensions. Finally, we find that including mass segregation in the form of luminosity segregation in the J band is strongly supported in all our models. Conclusions. We have put the question of the projected spatial distribution of the Pleiades cluster on a solid probabilistic framework, and inferred its properties using the most exhaustive and least contaminated list of Pleiades candidate members available to date. Our results suggest, however, that this sample may still lack about 20% of the expected number of cluster members. Therefore, this study should be revised when the completeness and homogeneity of the data can be extended beyond the 11.5 parsecs limit. Such a study will allow for more precise determination of the Pleiades spatial distribution, its tidal radius, ellipticity, number of objects and total mass.
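The Bayesian evidence and Bayes factors underlying the model comparison above can be illustrated on a toy one-parameter problem, where the marginal likelihood is a direct integral of likelihood times prior. The data, the unit-variance Gaussian likelihood, and the two competing models are assumptions for this sketch; real cluster models require far more sophisticated integration than a 1-D grid.

```python
import numpy as np

def log_evidence(log_likelihood, log_prior, grid):
    """Marginal likelihood of a 1-D model: numerically integrate
    likelihood x prior over a dense parameter grid (trapezoid rule)."""
    integrand = np.exp(log_likelihood(grid) + log_prior(grid))
    steps = np.diff(grid)
    return np.log(np.sum(0.5 * (integrand[:-1] + integrand[1:]) * steps))

# Toy data; Gaussian likelihood with unit variance (the constant
# -0.5*log(2*pi) per point is dropped, as it cancels in the ratio).
data = np.array([0.1, -0.2, 0.15, 0.05])
loglike = lambda mu: sum(-0.5 * (x - mu) ** 2 for x in data)

# Model 1: mean is a free parameter with a uniform prior on [-5, 5].
grid = np.linspace(-5.0, 5.0, 2001)
log_prior = lambda mu: np.full_like(mu, -np.log(10.0))
lz1 = log_evidence(loglike, log_prior, grid)

# Model 2: mean fixed at 0 (no free parameter), evidence = likelihood.
lz2 = loglike(0.0)

log_bayes_factor = lz2 - lz1   # > 0 favours the simpler model here
```

The diffuse prior spreads Model 1's mass over parameter values the data rule out, so the fixed-mean model wins: the same Occam penalty drives the choice between King's model and its generalizations.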


2021 ◽  
Author(s):  
Peter J. Gawthrop ◽  
Michael Pan ◽  
Edmund J. Crampin

Abstract Renewed interest in dynamic simulation models of biomolecular systems has arisen from advances in genome-wide measurement and applications of such models in biotechnology and synthetic biology. In particular, genome-scale models of cellular metabolism beyond the steady state are required in order to represent transient and dynamic regulatory properties of the system. Development of such whole-cell models requires new modelling approaches. Here we propose the energy-based bond graph methodology, which integrates stoichiometric models with thermodynamic principles and kinetic modelling. We demonstrate how the bond graph approach intrinsically enforces thermodynamic constraints, provides a modular approach to modelling, and gives a basis for estimation of model parameters leading to dynamic models of biomolecular systems. The approach is illustrated using a well-established stoichiometric model of E. coli and published experimental data.


2020 ◽  
Vol 5 ◽  
Author(s):  
Abdullah Alhawsawi ◽  
Majid Sarvi ◽  
Milad Haghani ◽  
Abbas Rajabifard

Modelling and simulating pedestrian motions are standard ways to investigate crowd dynamics with the aim of enhancing pedestrians’ safety. The movement of people is affected by interactions with one another and with the physical environment, which makes it a worthy line of research. This paper studies the impact of speed on how pedestrians respond to obstacles (i.e. obstacle avoidance behaviour). A field experiment was performed in which a group of people was instructed to perform obstacle avoidance tasks at two levels of speed, normal and high. Trajectories of the participants were extracted from the video recordings with the following aims: (i) to examine the impact of total speed (x- and y-axes), (ii) to observe the impact of speed on the movement direction (x-axis), and (iii) to determine the impact of speed on the lateral direction (y-axis). The results of the experiments could be used to enhance current pedestrian simulation models.

