The FMM Approach to Analyze Biomedical Signals: Theory, Software, Applications and Future

Mathematics ◽  
2021 ◽  
Vol 9 (10) ◽  
pp. 1145
Author(s):  
Cristina Rueda ◽  
Itziar Fernández ◽  
Yolanda Larriba ◽  
Alejandro Rodríguez-Collado

Oscillatory systems arise in many biological and medical fields, and mathematical and statistical approaches are fundamental to the analysis of these processes. The Frequency Modulated Möbius (FMM) approach, reviewed in this paper, is one such approach. Although little known, as it has only recently been developed, it solves a variety of exciting questions with real data; some of them, such as the decomposition of the signal into components and their multiple uses, are of general application, while others are specific. Among the exciting specific applications is the automatic interpretation of the electrocardiogram signal. In this paper, the theoretical, statistical, and computational properties of the FMM approach are reviewed. Additionally, as a novelty, the usefulness of the FMM approach for the analysis of blood pressure signals is shown. For the latter, a new robust estimation algorithm is proposed using FMM models with restrictions. The paper ends with a view of challenges for the future.
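As a rough illustration of the signal decomposition mentioned above, the sketch below evaluates a single FMM component using the commonly published FMM wave, μ(t) = M + A·cos(β + 2·arctan(ω·tan((t − α)/2))). This is an editor-added sketch, not the authors' software; parameter names follow the usual FMM notation.

```python
import numpy as np

def fmm_wave(t, M, A, alpha, beta, omega):
    """Evaluate one FMM (Frequency Modulated Moebius) component.

    mu(t) = M + A * cos(phi(t)), with the Moebius phase
    phi(t) = beta + 2 * arctan(omega * tan((t - alpha) / 2)).
    t is in radians on [0, 2*pi); alpha locates the peak, beta fixes the
    phase at the peak, and omega in (0, 1] controls peak sharpness.
    """
    phase = beta + 2.0 * np.arctan(omega * np.tan((t - alpha) / 2.0))
    return M + A * np.cos(phase)

# A signal is modelled as a baseline plus a sum of such components;
# here a single sharp component peaking at t = pi.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
wave = fmm_wave(t, M=0.0, A=1.0, alpha=np.pi, beta=0.0, omega=0.2)
```

Small omega yields the narrow, spike-like waves that make the model suitable for ECG-type signals.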

Author(s):  
Flah Aymen ◽  
Habib Kraiem ◽  
Sbita Lassaâd

In this chapter, two computational algorithms are proposed and applied to an estimation algorithm in order to improve the overall performance of the estimation phase. The proposed system is based on the Model Reference Adaptive System (MRAS). The importance of the estimation phase is most evident in applications involving electrical motors, where a large number of parameters are normally measured with physical equipment such as a tesla meter or a shaft speed sensor. The idea is to replace this measurement hardware with software that guarantees the desired estimation phase. In this chapter, the MRAS technique is proposed as that software algorithm, estimating all characteristic PMSM parameters online. Our approach aims to improve the MRAS technique with the intelligent optimization methods known as BFO and PSO.
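The PSO side of the chapter's idea can be sketched as follows: a particle swarm searches the parameter space to minimize an estimation-error cost, as one might do when tuning the adaptation gains of an MRAS estimator. This is an illustrative minimal PSO, not the chapter's implementation; the quadratic toy cost stands in for the MRAS tracking error.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(cost, bounds, n_particles=20, n_iter=60,
                 w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer (global-best variant)."""
    lo, hi = np.array(bounds, dtype=float).T
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # positions
    v = np.zeros_like(x)                               # velocities
    pbest = x.copy()
    pbest_cost = np.array([cost(p) for p in x])
    g = pbest[pbest_cost.argmin()].copy()              # global best
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        improved = c < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], c[improved]
        g = pbest[pbest_cost.argmin()].copy()
    return g, pbest_cost.min()

# Toy cost: squared distance to a "true" parameter pair, standing in for
# the squared error between estimated and measured motor response.
best, best_cost = pso_minimize(lambda p: ((p - [2.0, -1.0]) ** 2).sum(),
                               bounds=[(-5, 5), (-5, 5)])
```

In the chapter's setting, `cost` would run the MRAS estimator with the candidate gains and return the accumulated speed or parameter estimation error.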


2020 ◽  
Vol 12 (18) ◽  
pp. 2923
Author(s):  
Tengfei Zhou ◽  
Xiaojun Cheng ◽  
Peng Lin ◽  
Zhenlun Wu ◽  
Ensheng Liu

Due to environmental or human factors, and because of the instrument itself, there are many uncertainties in point clouds, which directly affect the data quality and the accuracy of subsequent processing, such as point cloud segmentation, 3D modeling, etc. In this paper, to address this problem, stochastic information of the point cloud coordinates is taken into account and, on the basis of the scanner observation principle within the Gauss–Helmert model, a novel general point-based self-calibration method is developed for terrestrial laser scanners, incorporating both five additional parameters and six exterior orientation parameters. For cases where the instrument accuracy differs from the nominal one, a variance component estimation algorithm is implemented to reweight the outliers once the residual errors of the observations have been obtained. Considering that the proposed method is essentially a nonlinear model, the Gauss–Newton iteration method is applied to derive the solutions for the additional parameters and the exterior orientation parameters. We conducted experiments using simulated and real data and compared the results with those of two existing methods. The experimental results showed that the proposed method could improve the point accuracy from 10⁻⁴ to 10⁻⁸ (a priori known) and 10⁻⁷ (a priori unknown), and reduced the correlations among the parameters (approximately 60% of them). However, it is undeniable that some correlations increased instead, which is a limitation of the general method.
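The Gauss–Newton iteration used to solve for the calibration and orientation parameters can be sketched generically as below. This is an editor-added illustration of the iteration itself (with a numerical Jacobian and a toy exponential-fit residual), not the paper's scanner model.

```python
import numpy as np

def gauss_newton(residual, x0, n_iter=20, eps=1e-6):
    """Generic Gauss-Newton iteration for nonlinear least squares:
    repeatedly linearize the residual and solve the normal equations
    (here via lstsq) for the parameter update."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r = residual(x)
        # Forward-difference Jacobian of the residual vector.
        J = np.empty((r.size, x.size))
        for j in range(x.size):
            dx = np.zeros_like(x)
            dx[j] = eps
            J[:, j] = (residual(x + dx) - r) / eps
        step = np.linalg.lstsq(J, -r, rcond=None)[0]
        x = x + step
        if np.linalg.norm(step) < 1e-12:
            break
    return x

# Toy example: fit y = a * exp(b * t) to noiseless data; in the paper the
# residuals would come from the Gauss-Helmert scanner observation model.
t = np.linspace(0, 1, 30)
y = 2.0 * np.exp(-1.5 * t)
sol = gauss_newton(lambda p: p[0] * np.exp(p[1] * t) - y, x0=[1.0, 0.0])
```

In practice, the variance-component reweighting described above would adjust the observation weights between such iterations.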


2021 ◽  
Author(s):  
Bataa Lkhagvasuren ◽  
Minkyu Kwak ◽  
Hong Sung Jin ◽  
Gyuwon Seo ◽  
Sungyool Bong ◽  
...  

This paper proposes a new window-wise state of charge (SOC) estimation algorithm based on Kalman filters (KF). In the first stage, the equivalent circuit model's parameters are estimated window-wise by least-squares estimation, assuming a linear relation between SOC and open-circuit voltage (OCV). The algorithm accurately estimates the parameters and tracks their changes with SOC. Moreover, based on the estimated parameters, the OCV values are identified. In the next stage, a window-wise linear Kalman filter (ES-LKF) without hysteresis, and extended (ES-EKF) and sigma-point (ES-SPKF) Kalman filter algorithms with hysteresis, are executed to estimate SOC. With fewer state equations and with the hysteresis parameters tuned offline, the ES-EKF and ES-SPKF perform better than the algorithms considered in previous works. The algorithms are validated by experiments with real data obtained from lab tests.
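A single step of a linear-KF SOC estimator under a linear OCV(SOC) assumption can be sketched as below, in the spirit of the ES-LKF stage. This is an editor-added sketch: the simple Rint voltage model and all parameter names are illustrative, not the paper's.

```python
import numpy as np

def soc_kf_step(soc, P, i_k, v_meas, dt, Q_cap, R0,
                ocv=lambda s: 3.0 + 1.2 * s, h=1.2, q=1e-8, r=1e-4):
    """One scalar Kalman-filter SOC update.
    Predict by coulomb counting, then correct against the measured
    terminal voltage v = OCV(SOC) - i*R0, with linear OCV slope h."""
    soc_pred = soc - i_k * dt / Q_cap          # predict (discharge positive)
    P_pred = P + q
    y = v_meas - (ocv(soc_pred) - i_k * R0)    # innovation
    S = h * P_pred * h + r
    K = P_pred * h / S                          # Kalman gain
    return soc_pred + K * y, (1.0 - K * h) * P_pred

# Simulate a 1 Ah cell discharged at 1 A, with a deliberately wrong
# initial SOC estimate; the filter pulls the estimate onto the true SOC.
Q_cap, R0, dt = 3600.0, 0.05, 1.0
true_soc, est_soc, P = 0.9, 0.5, 1.0
for _ in range(200):
    true_soc -= 1.0 * dt / Q_cap
    v = (3.0 + 1.2 * true_soc) - 1.0 * R0
    est_soc, P = soc_kf_step(est_soc, P, 1.0, v, dt, Q_cap, R0)
```

The window-wise scheme in the paper would re-estimate `R0`, `h`, and the OCV intercept by least squares on each data window before running such filter steps.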


Author(s):  
Martín Di Blasi ◽  
Renan Martins Baptista ◽  
Carlos Muravchik

A novel leak localization method for multi-section pipelines is presented. Based on the thermodynamic pressure-drop patterns along the pipeline under normal flowing operation, the system continuously compares these patterns with the measured pressure drops and decides, based on the best fit, in which section the leak occurs. A statistical approach is used to account for noisy measured signals. The method uses steady-state fluid equations, a recursive parameter estimation algorithm, and statistical decision and pattern recognition techniques. A modification is introduced to account for the cost of choosing the wrong leaky section, expressed as the excess volume spilled due to gravitational flow after pipeline shutdown. This leads to a Bayesian decision scheme minimizing a risk functional. The costs are the spill volumes, obtained from dynamic simulation of the pipeline under the various possible decision scenarios. Finally, details are given of the successful implementation of the system on a 500 km long oil pipeline, and real data from a simulated leak experiment are shown.
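The Bayesian decision step can be sketched as follows: given posterior probabilities for each section (from the pressure-drop fit) and a cost matrix of excess spill volumes (from dynamic simulation), pick the section minimizing the Bayes risk. This is an editor-added sketch with hypothetical numbers, not the paper's data.

```python
import numpy as np

def bayes_decide(posteriors, spill_cost):
    """Choose the leak section d minimizing the Bayes risk
    risk[d] = sum_s spill_cost[d, s] * P(leak in s),
    where spill_cost[d, s] is the excess volume spilled if section d is
    declared while the leak is actually in section s."""
    risk = spill_cost @ posteriors
    return int(np.argmin(risk)), risk

# Hypothetical 3-section pipeline: posteriors from the statistical fit,
# costs (m^3) from simulating shutdown under each wrong-choice scenario.
post = np.array([0.2, 0.5, 0.3])
C = np.array([[0.0,  80.0, 120.0],
              [60.0,  0.0,  90.0],
              [150.0, 70.0,  0.0]])   # C[d, s], zero on the diagonal
choice, risk = bayes_decide(post, C)
```

Minimizing risk rather than simply taking the most probable section is exactly what lets the asymmetric spill costs influence the decision.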


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Weisan Wu

Abstract. The protection of private data is a hot research issue in the era of big data. Differential privacy provides strong privacy guarantees in data analysis. In this paper, we propose DP-MSNM, a parametric density estimation algorithm that brings the multivariate skew-normal mixtures (MSNM) model to differential privacy. MSNM can handle the asymmetry of data sets, and it can approximate any distribution through the expectation–maximization (EM) algorithm. In this model, we add two extra steps to the estimated parameters in the M step of each iteration. The first step adds calibrated noise to the estimated parameters based on the Laplace mechanism. The second step post-processes those noisy parameters to restore their intrinsic characteristics, based on vector normalization and the theory of positive semi-definite matrices. Extensive experiments on real data sets evaluate the performance of DP-MSNM and demonstrate that the proposed method outperforms DPGMM.
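The two extra M-step operations can be sketched as below: Laplace-mechanism noise calibrated to sensitivity/ε, followed by a projection of the noisy covariance back to the positive semi-definite cone. This is an editor-added sketch; eigenvalue clipping is one common post-processing choice and may differ from the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

def laplace_noise(shape, sensitivity, epsilon):
    """Calibrated Laplace noise: scale = sensitivity / epsilon yields
    epsilon-differential privacy for a query with that L1 sensitivity."""
    return rng.laplace(loc=0.0, scale=sensitivity / epsilon, size=shape)

def nearest_psd(S):
    """Post-process a noisy covariance estimate back to a valid
    (symmetric, positive semi-definite) matrix by symmetrizing and
    clipping negative eigenvalues to zero."""
    S = (S + S.T) / 2.0
    vals, vecs = np.linalg.eigh(S)
    return vecs @ np.diag(np.clip(vals, 0.0, None)) @ vecs.T

# A component covariance as it might appear in the M step, perturbed for
# privacy and then repaired so the EM iteration can continue.
cov = np.array([[2.0, 0.3],
                [0.3, 1.0]])
noisy = cov + laplace_noise(cov.shape, sensitivity=1.0, epsilon=2.0)
repaired = nearest_psd(noisy)
```

Because post-processing uses only the noisy output, the repair step consumes no additional privacy budget.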


2021 ◽  
Vol 13 (4) ◽  
pp. 713
Author(s):  
Xuanwen Tao ◽  
Mercedes E. Paoletti ◽  
Juan M. Haut ◽  
Peng Ren ◽  
Javier Plaza ◽  
...  

Endmember estimation plays a key role in hyperspectral image unmixing, which often requires estimating the number of endmembers and then extracting them. However, most existing extraction algorithms require prior knowledge of the number of endmembers, a critical input to the unmixing process. To bridge this gap, a new maximum distance analysis (MDA) method is proposed that simultaneously estimates the number and the spectral signatures of the endmembers without any prior information, for experimental data containing pure pixel spectral signatures and no noise. It is based on the assumption that the endmembers form the simplex with the greatest volume over all pixel combinations. The simplex includes the pixel point farthest from the coordinate origin in the spectral space, which implies that: (1) the pixel point farthest from any other pixel point must be an endmember, (2) the pixel point farthest from any line must be an endmember, and (3) the pixel point farthest from any plane (or affine hull) must be an endmember. Under this scenario, the pixel point farthest from the coordinate origin is the first endmember and is used to create the aforementioned point, line, plane, and affine hull. The remaining endmembers are extracted by repeatedly searching for the pixel points that satisfy the above three assumptions. In addition to behaving as an endmember estimation algorithm by itself, the MDA method can cooperate with existing endmember extraction techniques without the pure pixel assumption, generalizing them into more effective schemes. The conducted experiments validate the effectiveness and efficiency of our method on synthetic and real data.
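The core geometric operation behind assumptions (2) and (3) is finding the pixel farthest from the affine hull of the endmembers found so far. A simplified sketch of that step (editor-added; not the paper's implementation) follows.

```python
import numpy as np

def farthest_from_affine_hull(X, E):
    """Distance of each pixel (rows of X) to the affine hull spanned by
    the current endmembers (rows of E); the farthest pixel is taken as
    the next endmember candidate."""
    base = E[0]
    D = (E[1:] - base).T            # columns: directions spanning the hull
    P = X - base
    if D.size:
        # Subtract the projection of each shifted pixel onto the hull.
        coeffs = np.linalg.lstsq(D, P.T, rcond=None)[0]
        P = P - (D @ coeffs).T
    d = np.linalg.norm(P, axis=1)
    return int(np.argmax(d)), d

# Toy 2-band data: three simplex vertices plus interior mixtures. With the
# two farthest-apart points as current endmembers (the line case), the
# third vertex is the farthest point and is recovered as an endmember.
X = np.array([[0.0, 0.0],
              [4.0, 0.0],
              [0.0, 3.0],
              [1.0, 1.0],
              [2.0, 0.5]])
idx, d = farthest_from_affine_hull(X, E=X[[0, 1]])
```

Iterating this search, and stopping when the maximum distance falls to (near) zero, is what lets MDA estimate the number of endmembers and their signatures at the same time.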


Geophysics ◽  
2005 ◽  
Vol 70 (3) ◽  
pp. P13-P18 ◽  
Author(s):  
Wenkai Lu ◽  
Yandong Li ◽  
Shanwen Zhang ◽  
Huanqin Xiao ◽  
Yanda Li

This article proposes a new higher-order-statistics-based coherence-estimation algorithm, which we denote as HOSC. Unlike the traditional crosscorrelation-based C1 coherence algorithm, which sequentially estimates correlation in the inline and crossline directions and uses their geometric mean as a coherence estimate at the analysis point, our method exploits three seismic traces simultaneously to calculate a 2D slice of their normalized fourth-order moment with one zero-lag correlation and then searches for the maximum correlation point on the 2D slice as the coherence estimate. To include more seismic traces in the coherence estimation, we introduce a supertrace technique that constructs a new data cube by rearranging several adjacent seismic traces into a single supertrace. Combining our supertrace technique with the C1 and HOSC algorithms, we obtain two efficient coherence-estimation algorithms, which we call ST-C1 and ST-HOSC. Application results on the real data set show that our algorithms are able to reveal more details about the structural and stratigraphic features than the traditional C1 algorithm, yet still preserve its computational efficiency.
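For context, the crosscorrelation-based baseline that HOSC is compared against can be sketched as below: the maximum normalized crosscorrelation of the centre trace with an inline and a crossline neighbour, combined by a geometric mean. This is an editor-added sketch of the C1-style baseline, not the HOSC fourth-order-moment computation itself.

```python
import numpy as np

def c1_coherence(center, inline, crossline, max_lag=3):
    """C1-style coherence at one analysis point: geometric mean of the
    lag-searched normalized crosscorrelations with the two neighbours."""
    def max_xcorr(a, b):
        best = 0.0
        for lag in range(-max_lag, max_lag + 1):
            if lag >= 0:
                x, y = a[lag:], b[:len(b) - lag]
            else:
                x, y = a[:lag], b[-lag:]
            denom = np.linalg.norm(x) * np.linalg.norm(y)
            if denom > 0:
                best = max(best, float(x @ y) / denom)
        return best
    return np.sqrt(max_xcorr(center, inline) * max_xcorr(center, crossline))

# Identical traces shifted by small lags are coherent (value near 1);
# a fault or channel edge would break this similarity and lower the value.
t = np.linspace(0, 1, 64)
trace = np.sin(2 * np.pi * 8 * t)
coh = c1_coherence(trace, np.roll(trace, 1), np.roll(trace, 2))
```

The supertrace variant in the article would first concatenate several adjacent traces into each input before computing such correlations.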


2012 ◽  
Vol 5 (5) ◽  
pp. 1099-1119 ◽  
Author(s):  
S. Sanghavi ◽  
J. V. Martonchik ◽  
J. Landgraf ◽  
U. Platt

Abstract. Due to the well-defined vertical profile of O2 in the atmosphere, the strong A-band (757–774 nm) has long been used to estimate vertical distributions of aerosol/cloud from space. We extend this approach to include part of the O2 B-band (684–688 nm) as well. SCIAMACHY onboard ENVISAT is the first instrument to provide spectral data at moderate resolution (0.2–1.5 nm) in the UV/VIS/NIR including both the O2 A- and B-bands. Using SCIAMACHY specifications, we make combined use of these bands in an optimal estimation algorithm. Theoretical studies show that our algorithm is applicable both over bright and dark surfaces for the retrieval of a lognormal approximation of the vertical profile of particulate matter, in addition to its optical thickness. Synthetic studies and information content analyses prove that such a combined use provides additional information on the vertical distribution of atmospheric scatterers, attributable to differences in the absorption strengths of the two bands and their underlying surface albedos. Due to the high computational cost of the retrieval, we restrict application to real data to a case study over Kanpur through the year 2003. Comparison with AERONET data shows a commonly observed seasonal pattern of haziness, manifesting a correlation coefficient of r = 0.92 for non-monsoon monthly mean AOTs. The retrieved particulate optical thickness is found to be anti-correlated with the relative contrast of the Lambertian equivalent reflectivity (LER) at 682 nm and 755 nm by a coefficient of 0.788, confirming the hypothesis made in Sanghavi et al. (2010). Our case study demonstrates a stable physics-based retrieval of particulate matter using only SCIAMACHY data. The feasibility of our approach is enhanced by the information provided by measurements around the O2 B-band in addition to the A-band. 
Nonetheless, operational application to SCIAMACHY data remains challenged by radiometric uncertainties, which render the simultaneous retrieval of particle microphysical parameters impracticable and lead to over-reliance on climatological data. Addressing these issues in future instruments similar to SCIAMACHY, coupled with greater computational resources and a speed-up of the current line-by-line radiative transfer calculations, would allow our approach to be extended to the global scale, particularly as it is not limited to dark surfaces.
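The optimal estimation framework behind such a retrieval can be sketched, in its linear form, as a maximum a posteriori update combining an a priori state with the measurements. This is an editor-added sketch of the standard (Rodgers-style) update; the paper's forward model is a full line-by-line radiative transfer code, not the toy Jacobian used here.

```python
import numpy as np

def oe_update(x_a, S_a, y, K, S_e):
    """Linear optimal-estimation step: prior state x_a with covariance
    S_a, measurements y with noise covariance S_e, Jacobian K.
    Returns the retrieved state and its posterior covariance."""
    S_e_inv = np.linalg.inv(S_e)
    S_hat = np.linalg.inv(K.T @ S_e_inv @ K + np.linalg.inv(S_a))
    x_hat = x_a + S_hat @ K.T @ S_e_inv @ (y - K @ x_a)
    return x_hat, S_hat

# Toy 2-state retrieval (say, optical thickness and a profile height
# parameter) from 3 synthetic radiances with small noise and a weak prior.
K = np.array([[1.0, 0.2],
              [0.5, 1.0],
              [0.1, 0.8]])
x_true = np.array([0.4, 2.0])
x_a = np.zeros(2)
x_hat, S_hat = oe_update(x_a, np.eye(2) * 10.0, K @ x_true, K,
                         np.eye(3) * 1e-4)
```

The information content analyses mentioned above amount to comparing the posterior covariance `S_hat` with the prior `S_a`: adding the B-band rows to `K` shrinks the posterior along the vertical-distribution direction.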


2021 ◽  
Vol 21 (6) ◽  
pp. 4249-4265
Author(s):  
Gemine Vivone ◽  
Giuseppe D'Amico ◽  
Donato Summa ◽  
Simone Lolli ◽  
Aldo Amodeo ◽  
...  

Abstract. The atmospheric boundary layer (ABL) is the lowermost part of the atmosphere, directly in contact with the Earth's surface. The estimation of its depth is of crucial importance in meteorology and for anthropogenic pollution studies. ABL height (ABLH) measurements are usually far from adequate, both spatially and temporally. Thus, different remote sensing sources can be of great help in growing both the spatial and temporal ABLH measurement capabilities. To this aim, aerosol backscatter profiles are widely used as a proxy for retrieving the ABLH. Hence, the scientific community is making remarkable efforts to develop automatic ABLH retrieval algorithms applied to lidar observations. In this paper, we propose an ABLH estimation algorithm based on image processing techniques applied to the composite image of the total attenuated backscatter coefficient. A pre-processing step based on morphological filters is applied to the composite total backscatter image to properly set up and adjust the image for edge detection. As a final step, the detected edges are post-processed through both mathematical morphology and an object-based analysis. The performance of the proposed approach is assessed on real data acquired by two different lidar systems, deployed in Potenza (Italy) and Évora (Portugal), belonging to the European Aerosol Research Lidar Network (EARLINET). The proposed approach shows higher performance than a benchmark consisting of several state-of-the-art ABLH estimation methods.
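The morphological pre-processing idea can be sketched on a synthetic time-height backscatter image: despeckle, then compute a morphological gradient (dilation minus erosion) so the sharp aerosol gradient at the ABL top stands out. This is an editor-added, heavily simplified stand-in for the paper's processing chain.

```python
import numpy as np
from scipy import ndimage

def ablh_edges(backscatter, size=3):
    """Morphological gradient of a time-height backscatter image:
    median-filter to remove speckle, then take the difference between a
    grey dilation and a grey erosion (flat structuring element)."""
    smooth = ndimage.median_filter(backscatter, size=size)
    grad = (ndimage.maximum_filter(smooth, size=size)
            - ndimage.minimum_filter(smooth, size=size))
    return grad

# Synthetic scene: high backscatter below a 1.5 km ABL top, low above,
# constant in time; the gradient peaks at the layer top.
heights = np.linspace(0.0, 3.0, 60)                  # km, axis 0
profile = np.where(heights < 1.5, 1.0, 0.1)
image = np.tile(profile[:, None], (1, 20))           # 20 time steps
grad = ablh_edges(image)
row = int(np.argmax(grad[:, 10]))                    # edge row at one time
```

In the paper, the detected edges from real composites are then cleaned by further morphology and an object-based analysis before the ABLH is assigned.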

