Non-Extensive Statistical Analysis of Acoustic Emissions: The Variability of Entropic Index q during Loading of Brittle Materials until Fracture

Entropy
2021
Vol 23 (3)
pp. 276
Author(s):
Andronikos Loukidis
Dimos Triantis
Ilias Stavrakas

Non-extensive statistical mechanics (NESM), introduced by Tsallis based on the principle of non-additive entropy, is a generalisation of the Boltzmann–Gibbs statistics. NESM has been shown to provide the necessary theoretical and analytical framework for studying complex systems such as the fracture mechanisms and crack evolution processes that occur in mechanically loaded specimens of brittle materials. In the current work, acoustic emission (AE) data recorded when marble and cement mortar specimens were subjected to three distinct loading protocols until fracture are discussed in the context of NESM. The NESM analysis showed that the cumulative distribution functions of the AE interevent times (i.e., the time intervals between successive AE hits) follow a q-exponential function. For each examined specimen, the corresponding Tsallis entropic q-index and the parameters βq and τq were calculated. The entropic index q shows a systematic behaviour strongly related to the various stages of the implemented loading protocols for all the examined specimens. The results seem to support the idea of using the entropic index q as a potential pre-failure indicator of the impending catastrophic fracture of mechanically loaded specimens.
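The fitting step described in this abstract can be sketched as follows: synthetic interevent times are drawn from a q-exponential survival function and the entropic index q and parameter βq are recovered by a least-squares fit. The true values (q = 1.3, βq = 2.0), the sample size and the sampling scheme are assumptions of this sketch, not data from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def q_exp(dt, q, beta):
    """Tsallis q-exponential survival function: [1 - (1 - q)*beta*dt]^(1/(1 - q))."""
    base = 1.0 - (1.0 - q) * beta * dt
    return np.power(np.clip(base, 0.0, None), 1.0 / (1.0 - q))

rng = np.random.default_rng(0)
q_true, beta_true = 1.3, 2.0                     # assumed values for the demo
u = rng.uniform(size=5000)
# Inverse-CDF sampling of interevent times from the q-exponential survival function
dt = (1.0 - u ** (1.0 - q_true)) / ((1.0 - q_true) * beta_true)

# Empirical survival function P(>dt) and least-squares fit of (q, beta_q)
dt_sorted = np.sort(dt)
surv = 1.0 - np.arange(1, dt.size + 1) / dt.size
(q_fit, beta_fit), _ = curve_fit(q_exp, dt_sorted, surv,
                                 p0=(1.2, 1.0), bounds=([1.001, 1e-6], [2.0, 10.0]))
```

With the q-index bounded above 1, the base of the power is always positive, so the fit is numerically stable; the fitted values land close to the assumed q and βq.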

Entropy
2020
Vol 22 (10)
pp. 1115
Author(s):
Andronikos Loukidis
Dimos Triantis
Ilias Stavrakas

Non-extensive statistical mechanics (NESM), which is a generalization of the traditional Boltzmann–Gibbs statistics, constitutes a theoretical and analytical tool for investigating the irreversible damage evolution processes and fracture mechanisms occurring when materials are subjected to mechanical loading. In this study, NESM is used for the analysis of the acoustic emission (AE) events recorded when marble and cement mortar specimens were subjected to mechanical loading until fracture. In total, AE data originating from four distinct loading protocols are presented. The cumulative distributions of the inter-event times (the time interval between two consecutive AE events) and the inter-event distances (the three-dimensional Euclidean distance between the centers of successive AE events) were examined under the above concept, and it was found that NESM is suitable for detecting criticality in terms of the mechanical status of a material. This was done by evaluating the fitting results of the q-exponential function and the corresponding q-indices of Tsallis entropy, qδτ and qδr, along with the parameters τδτ and dδr. The results support the relation qδτ+qδr≈2 for the AE data recorded from the marble and cement mortar specimens of this work, in good agreement with the conjecture previously reported for seismological data and for AE data recorded from basalt specimens.


2020
Vol 501 (1)
pp. 994-1001
Author(s):
Suman Sarkar
Biswajit Pandey
Snehasish Bhattacharjee

ABSTRACT We use an information theoretic framework to analyse data from the Galaxy Zoo 2 project and study whether there are any statistically significant correlations between the presence of bars in spiral galaxies and their environment. We measure the mutual information between the barredness of galaxies and their environments in a volume-limited sample (Mr ≤ −21) and compare it with the same in data sets where (i) the bar/unbar classifications are randomized and (ii) the spatial distribution of galaxies is shuffled on different length scales. We assess the statistical significance of the differences in the mutual information using a t-test and find that both randomization of morphological classifications and shuffling of the spatial distribution do not alter the mutual information in a statistically significant way. The non-zero mutual information between the barredness and environment arises from the finite and discrete nature of the data set and can be entirely explained by mock Poisson distributions. We also separately compare the cumulative distribution functions of the barred and unbarred galaxies as a function of their local density. Using a Kolmogorov–Smirnov test, we find that the null hypothesis cannot be rejected even at 75 per cent confidence level. Our analysis indicates that environments do not play a significant role in the formation of a bar, which is largely determined by the internal processes of the host galaxy.
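The randomization test described above can be sketched as follows. The label and density-bin arrays below are synthetic stand-ins for the Galaxy Zoo 2 classifications and the discretized local environments; they are generated independently, so any measured mutual information is purely a finite-sample effect, which is the point the authors make:

```python
import numpy as np

def mutual_information(x, y):
    """Mutual information (in nats) between two discrete label arrays."""
    xv, xi = np.unique(x, return_inverse=True)
    yv, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((xv.size, yv.size))
    np.add.at(joint, (xi, yi), 1.0)          # joint histogram of the two labels
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(2)
barred = rng.integers(0, 2, 5000)  # bar / unbar flags, independent of environment here
env = rng.integers(0, 5, 5000)     # discretized local-density bins

mi_obs = mutual_information(barred, env)
# Null distribution from randomized morphological classifications, as in the paper
mi_null = [mutual_information(rng.permutation(barred), env) for _ in range(200)]
```

Because the labels are independent of the bins, `mi_obs` is small but non-zero and comparable to the null values, illustrating how a finite, discrete sample produces non-zero mutual information by itself.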


2021
Vol 13 (6)
pp. 1096
Author(s):
Soi Ahn
Sung-Rae Chung
Hyun-Jong Oh
Chu-Yong Chung

This study aimed to generate a near-real-time composite of aerosol optical depth (AOD) to improve predictive model ability and provide current conditions of aerosol spatial distribution and transport across Northeast Asia. AOD, a proxy for aerosol loading, is estimated remotely by various spaceborne imaging sensors capturing visible and infrared spectra. Nevertheless, differences in satellite-based retrieval algorithms, spatiotemporal resolution, sampling, radiometric calibration, and cloud-screening procedures create significant variability among AOD products. Satellite products, however, can be complementary in terms of their accuracy and spatiotemporal comprehensiveness. Thus, composite AOD products were derived for Northeast Asia based on data from four sensors: the Advanced Himawari Imager (AHI), the Geostationary Ocean Color Imager (GOCI), the Moderate Resolution Imaging Spectroradiometer (MODIS), and the Visible Infrared Imaging Radiometer Suite (VIIRS). Cumulative distribution functions were employed to estimate error statistics against measurements from the Aerosol Robotic Network (AERONET). In order to apply the AERONET point-specific error, coefficients for each satellite were calculated using inverse distance weighting. Finally, the root mean square error (RMSE) for each satellite AOD product was calculated based on inverse composite weighting (ICW). Hourly AOD composites were generated (00:00–09:00 UTC, 2017) using the regression equation derived from the comparison of the composite AOD error statistics to AERONET measurements, and the results showed that the correlation coefficient and RMSE values of the composite were close to those of the low Earth orbit satellite products (MODIS and VIIRS). The methodology and the resulting dataset demonstrate the successful merging of multi-sensor retrievals to produce long-term satellite-based climate data records.
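A toy version of the inverse composite weighting (ICW) step described above: each sensor's AOD retrieval is weighted by the inverse of its RMSE against AERONET, so the more accurate sensors dominate the composite. The RMSE and AOD values below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Per-sensor RMSE against AERONET and co-located AOD retrievals (values invented)
rmse = {"AHI": 0.15, "GOCI": 0.12, "MODIS": 0.10, "VIIRS": 0.11}
aod = {"AHI": 0.42, "GOCI": 0.40, "MODIS": 0.38, "VIIRS": 0.39}

# Inverse composite weighting: weight each sensor by 1/RMSE, normalised to sum to 1
sensors = list(rmse)
w = np.array([1.0 / rmse[s] for s in sensors])
w /= w.sum()
composite = float(np.dot(w, [aod[s] for s in sensors]))
```

With these numbers, MODIS (the lowest-RMSE sensor) receives the largest weight, which mirrors the paper's finding that the composite tracks the low Earth orbit products most closely.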


Author(s):  
Rama Subba Reddy Gorla

Heat transfer from a nuclear fuel rod bumper support was computationally simulated by a finite element method and probabilistically evaluated in view of several uncertainties in the performance parameters. Cumulative distribution functions and sensitivity factors were computed for the overall heat transfer rates due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in heat transfer and to the identification of the most critical measurements and parameters.
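The probabilistic evaluation described above can be sketched with a simple Monte Carlo surrogate. The response model (Newton's law of cooling) and the input distributions below are assumptions of this sketch standing in for the finite element model; the sketch shows how a cumulative distribution function of the response and per-input sensitivity factors are obtained:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20000
# Hypothetical thermodynamic random variables (distributions invented for the sketch)
h = rng.normal(500.0, 50.0, n)     # convective coefficient, W/(m^2 K)
area = rng.normal(0.02, 0.001, n)  # contact area, m^2
dT = rng.normal(40.0, 4.0, n)      # temperature difference, K

q = h * area * dT                  # overall heat transfer rate, W (Newton's law of cooling)

# Cumulative distribution function of the response
q_sorted = np.sort(q)
cdf = np.arange(1, n + 1) / n

# Sensitivity factors: correlation of each random input with the response
sens = {name: float(np.corrcoef(v, q)[0, 1])
        for name, v in [("h", h), ("area", area), ("dT", dT)]}
```

Ranking the sensitivity factors identifies the inputs whose uncertainty drives the response most, which is the "most critical design variables" step the abstract describes.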


2020
Vol 2020 (1)
Author(s):
Thabet Abdeljawad
Saima Rashid
Zakia Hammouch
İmdat İşcan
Yu-Ming Chu

Abstract The present article addresses the concept of p-convex functions on fractal sets. We are able to prove a novel auxiliary result. On the application side, the machinery of local fractional calculus is used to establish generalized Simpson-type inequalities for the class of functions whose local fractional derivatives, in absolute value at certain powers, are p-convex. The method we present offers an alternative derivation of the classical variants associated with generalized p-convex functions. Some of our results cover the classical convex functions and the classical harmonically convex functions. Novel applications to random variables, cumulative distribution functions and generalized bivariate means are given to confirm the correctness of the present results. The approach is efficient and reliable, and can be used as an alternative for establishing new solutions for different types of fractals in computer graphics.
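For orientation, the classical Simpson inequality on the real line, which results of this type generalize (here stated for f four times continuously differentiable on [a, b]; the fractal-set versions replace the derivative and integral by their local fractional analogues):

```latex
\left|\frac{1}{6}\left[f(a)+4f\!\left(\frac{a+b}{2}\right)+f(b)\right]
-\frac{1}{b-a}\int_a^b f(x)\,dx\right|
\le \frac{(b-a)^{4}}{2880}\,\bigl\|f^{(4)}\bigr\|_{\infty}.
```

Simpson-type inequalities of the kind proved in the paper relax the smoothness requirement, bounding the same quadrature error in terms of convexity properties of lower-order (local fractional) derivatives.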


2002
Vol 4 (3)
pp. 183-190
Author(s):
W. Hitzl
G. Grabner

Comparing different methods of keratoprosthesis (KP) with respect to their long-term success, as far as visual acuity is concerned, is difficult: both because a standardized reporting method has not been agreed upon by all research groups, and because the quality of life for the patient depends not only on the level of visual acuity but also, quite significantly, on the "survival time" of the implant. Therefore, an analysis of a single series of patients with Osteo–Odonto–Keratoprosthesis (OOKP) was performed. Statistical methods used by others for similar groups of surgical procedures have included descriptive statistics, survival analysis and ANOVA, comprising comparisons of empirical densities or distribution functions and empirical survival curves. The objective of this paper is to provide an inductive statistical method that avoids the problems of descriptive techniques and survival analysis. The statistical model meets four important standards: (1) the efficiency of a surgical technique can be assessed within an arbitrary time interval by a new index (the VAT-index); (2) possible autocorrelations of the data are taken into consideration; (3) the efficiency is stated not only by a point estimator but also by 95% point-wise confidence limits computed with the Monte Carlo method; and (4) the efficiency of a specific method is illustrated by line and range plots for quick inspection and can also be used for the comparison of different other surgical techniques, such as refractive techniques, glaucoma and retinal surgery.
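The Monte Carlo point-wise confidence limits of standard (3) can be sketched with a bootstrap over patients. The visual-acuity matrix below is synthetic, and the per-time mean score is only a stand-in for the VAT-index, whose actual definition is given in the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic visual-acuity scores: 30 patients observed at 10 follow-up times
va = np.clip(rng.normal(0.6, 0.2, (30, 10)), 0.0, 1.0)

# Point estimate of the efficiency index at each follow-up time (mean score, stand-in)
point = va.mean(axis=0)

# Monte Carlo (bootstrap) 95% point-wise confidence limits: resample whole patients,
# which keeps each patient's within-series autocorrelation intact
boot = np.array([va[rng.integers(0, 30, 30)].mean(axis=0) for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
```

Resampling patients rather than individual measurements is one simple way to respect the autocorrelation concern of standard (2); the resulting `lo`/`hi` band is what a line-and-range plot would display.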


2011
Vol 18 (2)
pp. 223-234
Author(s):
R. Haas
K. Born

Abstract. In this study, a two-step probabilistic downscaling approach is introduced and evaluated. The method is applied, as an example, to precipitation observations in the subtropical mountain environment of the High Atlas in Morocco. The challenge is to deal with complex terrain, heavily skewed precipitation distributions and a sparse amount of data, both spatially and temporally. In the first step of the approach, a transfer function between distributions of large-scale predictors and of local observations is derived, with the aim of forecasting cumulative distribution functions with parameters estimated from known data. In order to interpolate between sites, the second step applies multiple linear regression to the distribution parameters of the observed data, using local topographic information. By combining both steps, a prediction at every point of the investigation area is achieved. Both steps and their combination are assessed by cross-validation and by splitting the available dataset into a training subset and a validation subset. Because it estimates quantiles and probabilities of zero daily precipitation, this approach is found to be adequate even in areas with difficult topographic circumstances and low data availability.
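A toy version of the two steps, assuming a gamma model for wet-day precipitation and elevation as the single topographic predictor; both choices are assumptions of this sketch, not necessarily the paper's:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Step 1 surrogate: at each of 8 stations, fit a gamma distribution to wet-day amounts
elev = np.linspace(400.0, 2600.0, 8)          # station elevations (m), invented
shape_true = 0.8                              # assumed gamma shape
scale_true = 2.0 + 0.003 * elev               # assumed link: scale grows with elevation
scales = []
for sc in scale_true:
    sample = rng.gamma(shape_true, sc, 500)   # synthetic wet-day precipitation (mm)
    a, loc, s = stats.gamma.fit(sample, floc=0.0)
    scales.append(s)

# Step 2: linear regression of the fitted scale parameter on topography,
# giving a distribution estimate at any unobserved site
slope, intercept, r, _, _ = stats.linregress(elev, np.array(scales))
scale_at_1500m = intercept + slope * 1500.0
```

Predicting distribution parameters rather than raw amounts is what lets the approach deliver quantiles and zero-precipitation probabilities everywhere in the investigation area.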


Author(s):  
Aniruddha Choudhary
Ian T. Voyles
Christopher J. Roy
William L. Oberkampf
Mayuresh Patil

Our approach to the Sandia Verification and Validation Challenge Problem is to use probability bounds analysis (PBA), based on a probabilistic representation for aleatory uncertainties and an interval representation for (most) epistemic uncertainties. The nondeterministic model predictions thus take the form of p-boxes: bounding cumulative distribution functions (CDFs) that contain all the families of CDFs that could exist within the uncertainty bounds. The scarcity of experimental data provides little support for treating all uncertain inputs as purely aleatory and also precludes significant calibration of the models. We instead estimate the model form uncertainty at conditions where experimental data are available, then extrapolate this uncertainty to conditions where no data exist. The modified area validation metric (MAVM) is employed to estimate the model form uncertainty, which is important because the model involves significant simplifications (of both a geometric and a physical nature) of the true system. The results of the verification and validation processes are treated as additional interval-based uncertainties applied to the nondeterministic model predictions, on the basis of which the failure prediction is made. Based on this method, we estimate the probability of failure to be as large as 0.0034 and conclude that the tanks are unsafe.
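A sketch of the area-metric machinery behind the MAVM, using invented model and experimental samples. The one-sided areas d+ and d− between the two empirical CDFs are computed, then used to shift the model CDF into a bounding p-box; the shift convention shown here is an assumption of this sketch, stated in the comments:

```python
import numpy as np

def ecdf(vals, x):
    """Empirical CDF of `vals` evaluated at the points x."""
    return np.searchsorted(np.sort(vals), x, side="right") / len(vals)

rng = np.random.default_rng(6)
model = rng.normal(10.0, 1.0, 1000)   # nondeterministic model predictions (invented)
exper = rng.normal(10.5, 1.2, 15)     # sparse experimental observations (invented)

# One-sided areas between the experimental and model CDFs (the MAVM ingredients)
x = np.sort(np.concatenate([model, exper]))
fm, fe = ecdf(model, x), ecdf(exper, x)
dx = np.diff(x)
diff = (fe - fm)[:-1]
d_plus = float(np.sum(np.clip(diff, 0.0, None) * dx))     # experiment CDF above model
d_minus = float(np.sum(np.clip(-diff, 0.0, None) * dx))   # model CDF above experiment

# p-box: the model CDF shifted by the one-sided areas brackets the true response CDF
cdf_lower = ecdf(model, x - d_minus)  # lower bounding CDF (shifted toward larger values)
cdf_upper = ecdf(model, x + d_plus)   # upper bounding CDF (shifted toward smaller values)
```

Reading the failure threshold off the unfavourable edge of such a p-box is what produces a conservative "as large as" probability-of-failure statement like the 0.0034 reported above.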

