A Method for the Validation of Predictive Computations Using a Stochastic Approach

Author(s): Roger Ghanem, Manuel Pellissetti

The task of model validation deals with quantifying the extent to which predictions from a particular model can be relied upon as representative of the true behavior of the system being modeled. This issue is of great importance in assessing the reliability and safety of structures, since such assessments in most cases rely on predictions from sophisticated probabilistic models. The paper describes a formalism that extends the realm of the model to include all aspects of data collection and parameter calibration. Error estimators are developed that permit quantifying the value of computational effort (mesh refinement) versus analytical effort (model refinement) and experimental effort (data acquisition and analysis). Starting with the hypothesis that the material properties of a given medium can be modeled within the framework of probability theory, a very rich mathematical setting is available to completely characterize the probabilistic behavior and evolution of the associated random medium under an external disturbance. The probability measure on the material properties is uniquely transformed into a probability measure on the state of the medium. A computational implementation of the related concepts has been developed in the framework of stochastic finite elements and applied to a number of problems. Clearly, great value can be attributed to the ability to perform the forward analysis, whereby the probability measure on the state of the system is completely characterized by the measure on the material properties. The assumed probability measure on the material properties, however, depends greatly on the amount and quality of data used to synthesize it. As this measure is updated, estimates of the performance of the underlying natural or physical system change. Significant interest therefore exists in developing the capability to control the error in the probabilistic estimates through designed data collection.
The mathematical setting adopted in this paper for describing random variables is ideally suited for treating this problem as one of data refinement. A close parallel will be delineated between this concept and that of adaptive mesh refinement, well established in deterministic finite elements. Underlying this latter problem are issues of error estimation, which are relied upon to guide the refinement. The paper develops a similar "data refinement" concept and highlights its basic underlying principles.
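The forward analysis described above can be illustrated with a minimal sketch. The paper's actual machinery is the stochastic finite element method; the code below substitutes plain Monte Carlo sampling on a one-dimensional bar model, so all numbers (the lognormal parameters, load, and geometry) are illustrative assumptions, not values from the paper. It shows only the core idea: a probability measure on a material property induces, through the deterministic model, a measure on the state of the system.

```python
import numpy as np

# Illustrative sketch (not the paper's stochastic-FEM formalism):
# propagate a probability measure on a material property forward to a
# measure on the system state by Monte Carlo sampling. All parameter
# values below are assumed for the example.

rng = np.random.default_rng(0)

# Probability measure on the material property: lognormal Young's modulus.
# (mean/sigma parameterize the underlying normal distribution.)
E = rng.lognormal(mean=np.log(200e9), sigma=0.1, size=100_000)  # Pa

# Deterministic model: axial elongation of a bar, u = F L / (E A).
F, L, A = 1e5, 2.0, 1e-3  # load (N), length (m), cross-section (m^2)
u = F * L / (E * A)

# The induced measure on the state is characterized by its statistics;
# refining the data behind E sharpens these estimates.
print(f"mean elongation: {u.mean():.3e} m")
print(f"std  elongation: {u.std():.3e} m")
```

With more or better measurements of the modulus, the input measure tightens and the spread of the elongation estimate shrinks, which is the sense in which "data refinement" parallels mesh refinement.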

2019, Vol 86 (3), pp. 56-62
Author(s): I. Yu. Yegorov, V. Yu. Gryga

The article presents the results of a study on monitoring the digital economy and society in the Eastern Partnership countries. It gives a brief assessment of the state of affairs in these countries with respect to the availability of the data needed to calculate the main indicators of digitalization, first of all the DESI index, on the basis of OECD guidelines and recommendations. The assessment was carried out by comparing the information obtained with similar data, approaches, and practices of the European Union, using the EU countries as a "reference" level for closer alignment. Based on the analysis of the statistics reflecting digitalization processes in Ukraine and the other Eastern Partnership countries, recommendations are made for improving the organization of information collection in order to raise the quality of statistical data management. This also opens the way to closer coordination with the European approach to monitoring digitalization indicators, including calculation of the DESI index. In Ukraine, work is underway to improve the statistical tools for measuring and monitoring digitalization processes, based on the approaches used in the EU countries. Within the group of Eastern Partnership countries, Ukraine is not among the leaders; it lags behind some other countries, first of all Belarus. At the same time, there are some problems with the use of Eurostat tools for measuring digitalization.
In general, the main problems mentioned by representatives of statistical bodies in the course of data collection, analysis, and reporting are the lack of an appropriate statistical unit, a lack of funds for organizing surveys, a shortage of qualified personnel, and the absence of suitable data-collection tools (questionnaires, methodological materials for sampling, etc.). The low level of interest on the part of government bodies exacerbates these problems. The list of DESI indicators must be approved by the government; it should be developed by the State statistical service of Ukraine in cooperation with other ministries and state agencies. Only then will the State statistical service of Ukraine be able to change its statistical questionnaires (forms) and conduct the necessary specialized surveys. EU assistance may cover several activities that the State statistical service of Ukraine has identified as its needs: data-collection tools, staff training, data analysis, etc. Such assistance may be provided within the framework of the existing cooperation agreements between the State statistical service of Ukraine and the specialized statistical agencies of the EU.


2020, Vol 0 (0)
Author(s): Daniel Arndt, Guido Kanschat

Abstract: Finite elements of higher continuity, say conforming in H^2 instead of H^1, require a mapping from reference cells to mesh cells which is continuously differentiable across cell interfaces. In this article, we propose an algorithm to obtain such mappings given a topologically regular mesh in the standard format of vertex coordinates and a description of the boundary. A variant of the algorithm with orthogonal edges in each vertex is proposed. We introduce necessary modifications in the case of adaptive mesh refinement with nonconforming edges. Furthermore, we discuss efficient storage of the necessary data.
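The continuity requirement can be illustrated in one dimension. The sketch below is not the paper's algorithm; it is a toy analogue under assumed vertex data, showing why cell mappings built from vertex positions *and* shared vertex tangents (here, cubic Hermite polynomials) join continuously differentiably across a cell interface.

```python
# Toy 1D analogue (not the algorithm of the article): each mesh cell is
# mapped from the reference interval [0, 1] by a cubic Hermite polynomial.
# Adjacent cells share both the vertex position and the vertex tangent,
# so the composite mapping is C^1 across the interface.

def hermite_map(t, x0, x1, d0, d1):
    """Cubic Hermite map of reference coordinate t in [0, 1]:
    value x0, derivative d0 at t = 0; value x1, derivative d1 at t = 1."""
    h00 = 2*t**3 - 3*t**2 + 1
    h10 = t**3 - 2*t**2 + t
    h01 = -2*t**3 + 3*t**2
    h11 = t**3 - t**2
    return h00*x0 + h10*d0 + h01*x1 + h11*d1

def hermite_deriv(t, x0, x1, d0, d1):
    """Derivative of hermite_map with respect to t."""
    return ((6*t**2 - 6*t)*x0 + (3*t**2 - 4*t + 1)*d0
            + (-6*t**2 + 6*t)*x1 + (3*t**2 - 2*t)*d1)

# Two cells meeting at the vertex x = 1.0 with a shared tangent d = 1.5
# (all values assumed for illustration).
left  = dict(x0=0.0, x1=1.0, d0=1.0, d1=1.5)
right = dict(x0=1.0, x1=2.5, d0=1.5, d1=1.0)

# Positions and derivatives agree at the interface -> C^1 mapping.
print(hermite_deriv(1.0, **left), hermite_deriv(0.0, **right))
```

If the tangent stored at the shared vertex differed between the two cells, the mapping would still be continuous but its derivative would jump, which is exactly what higher-continuity elements cannot tolerate.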


2020, Vol 28 (2), pp. 63-74
Author(s): Joël Chaskalovic, Franck Assous

Abstract: The aim of this paper is to provide new perspectives on the relative accuracy of finite elements. Starting from a geometrical interpretation of the error estimate which can be deduced from the Bramble–Hilbert lemma, we derive a probability law that evaluates the relative accuracy, considered as a random variable, between two finite elements P_k and P_m, k < m. We extend this probability law to get a cumulative probability law for two main applications. The first concerns a family of meshes; the second is dedicated to a sequence of simplexes constituting a given mesh. Both of these applications could be considered a first step toward adaptive mesh refinement with probabilistic methods.
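For reference, the a priori estimate the abstract starts from can be stated in its standard form. The notation below (constant C_k, Sobolev norms) is the common finite element convention, not symbols fixed by the paper itself:

```latex
% Standard a priori error estimate for P_k Lagrange elements on a mesh
% of size h, for a sufficiently smooth solution u:
\| u - u_h^{(k)} \|_{H^1(\Omega)}
  \;\le\; C_k \, h^{k} \, | u |_{H^{k+1}(\Omega)},
  \qquad u \in H^{k+1}(\Omega).
```

For k < m the P_m error is asymptotically smaller as h goes to 0; the probability law of the paper quantifies how likely that ordering is to hold at a given finite h, treating the accuracy comparison as a random variable.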

