Assessment of the Quality of Latent Variable Calibrations Based on Monte Carlo Simulations

1994 ◽  
Vol 66 (7) ◽  
pp. 937-943 ◽  
Author(s):  
Hans R. Keller ◽  
Juergen Roettele ◽  
Hermann Bartels


Author(s):  
Sébastien Fouques ◽  
Ole Andreas Hermundstad

The paper is concerned with the launch of free-fall lifeboats (FFL). It proposes a method, compliant with the DNV-OS-E406 standard, for selecting characteristic launches from Monte Carlo simulations for further structural load assessment with CFD and FEM. Proxy variables derived from kinematic parameters and aimed at predicting pressure load indicators are computed with the VARUNA launch simulator developed by MARINTEK. The statistical distributions of the proxy variables obtained from the Monte Carlo simulations are used to identify critical scenarios, and characteristic launches can then be selected at a chosen probability level. The feasibility of the proposed method is documented in the paper for several types of pressure loads. Existing model test data from various FFL-launch campaigns in calm water and in waves are used to compute the proxy variables as would be done in the VARUNA simulator. Scatter diagrams showing the correlation with the measured pressure load indicators are then established to assess the quality of the chosen proxy variables.
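As a rough illustration of the selection step only (not the VARUNA implementation; the proxy distribution and probability level below are invented), a characteristic launch can be taken as the simulated launch whose proxy value lies closest to a chosen quantile of the Monte Carlo sample:

```python
# Hypothetical sketch of quantile-based launch selection; distribution
# and probability level are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for proxy-variable values computed for N simulated launches
# (e.g., a kinematic proxy for a slamming-pressure indicator).
proxy = rng.gumbel(loc=100.0, scale=15.0, size=10_000)

def characteristic_launch(proxy_values, probability_level=0.99):
    """Index of the launch whose proxy value is closest to the chosen
    quantile of the Monte Carlo distribution."""
    target = np.quantile(proxy_values, probability_level)
    return int(np.argmin(np.abs(proxy_values - target))), target

idx, target = characteristic_launch(proxy, 0.99)
print(f"99% quantile of proxy: {target:.1f}; characteristic launch: #{idx}")
```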


2014 ◽  
Vol 22 (1) ◽  
pp. 45-60 ◽  
Author(s):  
Daniel L. Oberski

Latent variable models can only be compared across groups when the groups exhibit measurement equivalence or “invariance”; otherwise, substantive differences may be confounded with measurement differences. This article suggests examining directly whether any measurement differences present could confound substantive analyses, by examining the expected parameter change in the parameters of interest (the “EPC-interest”). The EPC-interest approximates the change in parameters of interest that can be expected when freeing cross-group invariance restrictions. Monte Carlo simulations suggest that the EPC-interest approximates these changes well. Three empirical applications show that the EPC-interest can help avoid two undesirable situations: first, it can prevent the unnecessary conclusion that groups are incomparable; second, it alerts the user when comparisons of interest may be invalidated even though the invariance model appears to fit the data. R code and data for the examples discussed in this article are provided in the electronic appendix (http://hdl.handle.net/1902.1/21816).
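A minimal numerical sketch of the first-order idea behind EPC-type statistics, under the assumption that the expected change in all parameters when freeing restrictions is approximately the inverse information matrix times the score, both evaluated at the restricted estimates. All numbers are invented; the paper's R appendix gives the exact estimator:

```python
# Toy first-order expected-parameter-change computation; the EPC-interest
# is the sub-vector for the substantive parameters of interest.
import numpy as np

g = np.array([0.00, 0.00, 0.35])   # score: zero for free params, nonzero for the restriction
I = np.array([[2.0, 0.3, 0.5],
              [0.3, 1.5, 0.2],
              [0.5, 0.2, 1.0]])    # information matrix (toy values)

epc_all = np.linalg.solve(I, g)    # expected change for every parameter
interest = [0, 1]                  # indices of the parameters of interest
print("EPC-interest:", epc_all[interest])
```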


1996 ◽  
Vol 07 (03) ◽  
pp. 295-303 ◽  
Author(s):  
P. D. CODDINGTON

Large-scale Monte Carlo simulations require high-quality random number generators to ensure correct results. The reverse is also true: the quality of random number generators can be tested by using them in large-scale Monte Carlo simulations. We have tested many commonly used random number generators with high-precision Monte Carlo simulations of the 2-d Ising model using the Metropolis, Swendsen-Wang, and Wolff algorithms. This work is being extended to the testing of random number generators for parallel computers. The results of these tests are presented, along with recommendations for random number generators for high-performance computers, particularly for lattice Monte Carlo simulations.
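A minimal sketch of the testing idea: run the same 2-d Ising Metropolis simulation with two different generators and compare a known observable; a defective generator shows up as a statistically significant deviation. Lattice size, sweep count, and temperature below are illustrative only, and the generators compared are numpy's MT19937 and PCG64 rather than the paper's specific set:

```python
import numpy as np

def ising_energy(bit_gen, L=32, sweeps=2000, T=2.5, seed=1):
    rng = np.random.Generator(bit_gen(seed))
    s = rng.choice([-1, 1], size=(L, L))
    beta = 1.0 / T
    # checkerboard masks so neighbouring spins are never updated together
    masks = [(np.indices((L, L)).sum(0) % 2) == p for p in (0, 1)]
    energies = []
    for sweep in range(sweeps):
        for mask in masks:
            nb = (np.roll(s, 1, 0) + np.roll(s, -1, 0) +
                  np.roll(s, 1, 1) + np.roll(s, -1, 1))
            dE = 2.0 * s * nb                        # energy cost of flipping
            flip = mask & (rng.random((L, L)) < np.exp(-beta * dE))
            s = np.where(flip, -s, s)
        if sweep >= sweeps // 2:                     # crude equilibration cut
            nb = np.roll(s, 1, 0) + np.roll(s, 1, 1) # each bond counted once
            energies.append(-(s * nb).sum() / L**2)  # energy per spin
    return np.mean(energies)

print("MT19937:", ising_energy(np.random.MT19937))
print("PCG64:  ", ising_energy(np.random.PCG64))
```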


2012 ◽  
Vol 1471 ◽  
Author(s):  
Pierre-Emmanuel Berche ◽  
Saoussen Djedai ◽  
Etienne Talbot

Monte Carlo simulations are used to perform atomic-scale modelling of the magnetic properties of epitaxial exchange-coupled DyFe2/YFe2 superlattices. These samples, which have been studied extensively in experiments, consist of a hard ferrimagnet (DyFe2) and a soft ferrimagnet (YFe2) coupled antiferromagnetically. Depending on the layers and on the temperature, the field dependence of the magnetization depth profile is complex. In this work, we reproduce by Monte Carlo simulations the hysteresis loops for the net and compound-specific magnetizations at different temperatures, and we assess the quality of the results by direct comparison with experimental hysteresis loops.
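As a qualitative toy model only (not the paper's atomistic model), a 1-d chain of planar spins with a hard and a soft segment, an antiferromagnetic interface bond, and Metropolis relaxation at each field step can trace a stepped hysteresis loop of this kind. All couplings, anisotropies, and temperatures below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 40
hard = np.arange(N) < N // 2            # first half: hard layer
K = np.where(hard, 2.0, 0.05)           # uniaxial anisotropy (easy x-axis)
J = np.full(N - 1, 1.0)
J[N // 2 - 1] = -1.0                    # antiferromagnetic interface bond

def energy_local(theta, i, H):
    e = -K[i] * np.cos(theta[i]) ** 2 - H * np.cos(theta[i])
    if i > 0:
        e -= J[i - 1] * np.cos(theta[i] - theta[i - 1])
    if i < N - 1:
        e -= J[i] * np.cos(theta[i] - theta[i + 1])
    return e

def sweep(theta, H, T=0.05):
    for i in rng.permutation(N):
        old = theta[i]
        e0 = energy_local(theta, i, H)
        theta[i] = old + rng.normal(0, 0.3)      # small-angle proposal
        if energy_local(theta, i, H) - e0 > 0 and \
           rng.random() > np.exp(-(energy_local(theta, i, H) - e0) / T):
            theta[i] = old                       # Metropolis rejection
    return theta

theta = np.zeros(N)
fields = np.concatenate([np.linspace(3, -3, 40), np.linspace(-3, 3, 40)])
for H in fields:
    for _ in range(100):                         # relax at each field step
        theta = sweep(theta, H)
    print(f"H = {H:+.2f}  m = {np.cos(theta).mean():+.3f}")
```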


2014 ◽  
Vol 38 (5) ◽  
pp. 471-479 ◽  
Author(s):  
Alexander M. Schoemann ◽  
Patrick Miller ◽  
Sunthud Pornprasertmanit ◽  
Wei Wu

Planned missing data designs allow researchers to increase the amount and quality of data collected in a single study. Unfortunately, the effect of planned missing data designs on statistical power is not straightforward: under certain conditions a planned missing design will increase power, whereas in other situations it will decrease power. Thus, when designing a study utilizing planned missing data, researchers need to perform a power analysis. In this article, we describe methods for power analysis and sample size determination for planned missing data designs using Monte Carlo simulations. We also describe a new, more efficient method of Monte Carlo power analysis, software that can be used in these approaches, and several examples of popular planned missing data designs.
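A minimal sketch of the basic Monte Carlo power loop: simulate many data sets under the design (here, a hypothetical two-variable study where a random 30% of cases skip the second measure), analyse each one, and take the rejection rate as the power estimate. Real planned-missing analyses would use FIML or multiple imputation rather than the complete-case test used here:

```python
import numpy as np
from scipy import stats

def power_estimate(n=200, rho=0.2, miss=0.3, alpha=0.05, reps=2000, seed=7):
    rng = np.random.default_rng(seed)
    cov = [[1, rho], [rho, 1]]
    hits = 0
    for _ in range(reps):
        data = rng.multivariate_normal([0, 0], cov, size=n)
        observed = rng.random(n) >= miss          # planned-missing indicator
        x, y = data[observed, 0], data[observed, 1]
        if stats.pearsonr(x, y)[1] < alpha:       # p-value of the test
            hits += 1
    return hits / reps

print(f"Estimated power: {power_estimate():.3f}")
```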


2014 ◽  
Vol 12 ◽  
pp. 75-81 ◽  
Author(s):  
C. Brugger ◽  
S. Weithoffer ◽  
C. de Schryver ◽  
U. Wasenmüller ◽  
N. Wehn

Powerful compute clusters and multi-core systems have become widely available in research and industry. This boost in available computational power tempts people to run compute-intensive tasks on such clusters, whether for speed or for accuracy. Monte Carlo simulations in particular, with their inherent parallelism, promise very high speedups. Nevertheless, the quality of Monte Carlo simulations depends strongly on the quality of the employed random numbers. In this work we present a comprehensive analysis of state-of-the-art pseudo-random number generators, such as the MT19937 and the WELL generators, used for parallel stream generation in different settings. These random number generators can be realized in hardware as well as in software and help to accelerate the analysis (or simulation) of communications systems. We show that it is possible to generate high-quality parallel random number streams with both generators, as long as certain configuration constraints are met. We furthermore show that distributed simulations with these generator types are viable even at very high degrees of parallelism.
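One widely used way to build independent parallel streams with such generators, shown here as a general illustration rather than the paper's specific hardware or WELL configuration, is numpy's SeedSequence spawning, which hands each worker its own MT19937 bit generator:

```python
import numpy as np

n_workers = 8
root = np.random.SeedSequence(20240101)
streams = [np.random.Generator(np.random.MT19937(child))
           for child in root.spawn(n_workers)]

# Each worker draws from its own statistically independent stream.
partial_means = [g.standard_normal(100_000).mean() for g in streams]
print([f"{m:+.4f}" for m in partial_means])
```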


2014 ◽  
Vol 12 (2) ◽  
pp. 229
Author(s):  
Marco Aurélio Dos Santos Sanfins ◽  
Danilo Soares Monte-Mor

Given the recent international crises and the increasing number of defaults, several researchers have attempted to develop metrics that calculate the probability of insolvency with higher accuracy. The approaches commonly used, however, consider neither the credit risk nor the severity of the distance between receivables and obligations across different periods. In this paper we present a mathematical approach that allows us to estimate the insolvency risk by considering not only future receivables and obligations, but also the severity of the distance between them and the quality of the respective receivables. Using Monte Carlo simulations and hypothetical examples, we show that our metric is able to estimate the insolvency risk with high accuracy. Moreover, our results suggest that, in the absence of a smooth distribution between receivables and obligations, there is a non-null insolvency risk even when the present value of the receivables is larger than the present value of the obligations.
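A toy illustration of that last point (not the authors' metric; all cash flows and the default probability are invented): even when total receivables exceed total obligations, timing mismatches combined with credit risk leave a non-null probability that the running balance dips below zero in some period:

```python
import numpy as np

rng = np.random.default_rng(3)
initial_cash = 200.0
receivables = np.array([0.0, 0.0, 150.0, 150.0])  # cash arrives late...
obligations = np.array([80.0, 80.0, 50.0, 50.0])  # ...but bills come early
p_default = 0.05                                  # per-receivable credit risk
sims = 100_000

# Each receivable is either paid in full or defaults entirely.
paid = receivables * (rng.random((sims, 4)) >= p_default)
balance = initial_cash + np.cumsum(paid - obligations, axis=1)
print(f"P(insolvency): {(balance.min(axis=1) < 0).mean():.3f}")
```

Here the expected receivables (285) exceed the obligations (260), yet the estimated insolvency probability stays near 5% because a single default in the third period exhausts the buffer.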


2021 ◽  
Vol 95 (5) ◽  
Author(s):  
Matthias Schartner ◽  
Christian Plötz ◽  
Benedikt Soja

Within this work, a new geodetic very long baseline interferometry (VLBI) scheduling approach inspired by evolutionary processes based on selection, crossover, and mutation is presented. It mimics the biological concept of “survival of the fittest” to iteratively explore the scheduling parameter space in search of the best solution. Besides providing high-quality results, one main benefit of the proposed approach is that it enables the generation of fully automated and individually optimized schedules. Moreover, it generates schedules based on transparent rules and well-defined scientific goals, making decisions based on Monte Carlo simulations. The improvements in the precision of the geodetic parameters are discussed for various observing programs organized by the International VLBI Service for Geodesy and Astrometry (IVS), such as the OHG, R1, and T2 programs. For schedules with a difficult telescope network, an improvement in the precision of the geodetic parameters of up to 15% could be identified, as well as an increase in the number of observations of up to 10% compared to classical scheduling approaches. Owing to the high quality of the produced schedules and the reduced workload for the schedulers, various IVS observing programs already make use of the evolutionary parameter selection, such as the AUA, INT2, INT3, INT9, OHG, T2 and VGOS-B programs.
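A generic sketch of the evolutionary loop described here: selection, crossover, and mutation over a vector of scheduling parameters. The fitness function below is a stand-in; in the paper it would be the precision of geodetic parameters estimated from Monte Carlo simulations of each candidate schedule:

```python
import numpy as np

rng = np.random.default_rng(11)
n_params, pop_size, generations = 5, 30, 40
target = np.array([0.2, 0.8, 0.5, 0.1, 0.9])   # invented optimum

def fitness(params):
    return -np.sum((params - target) ** 2)     # higher is better (stand-in)

pop = rng.random((pop_size, n_params))
for gen in range(generations):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-pop_size // 2:]]  # survival of the fittest
    children = []
    while len(children) < pop_size - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        mask = rng.random(n_params) < 0.5               # uniform crossover
        child = np.where(mask, a, b)
        child = child + rng.normal(0, 0.05, n_params) * \
            (rng.random(n_params) < 0.2)                # sparse mutation
        children.append(np.clip(child, 0, 1))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(p) for p in pop])]
print("Best parameters:", np.round(best, 3))
```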

