On the Analysis of the Phase Unwrapping Process in a D-InSAR Stack with Special Focus on the Estimation of a Motion Model

2019, Vol 11 (19), pp. 2295
Author(s): Christina Esch, Joël Köhler, Karlheinz Gutjahr, Wolf-Dieter Schuh

This paper analyses the critical phase unwrapping step in a differential SAR interferometry (D-InSAR) stack, discussing both conventional solution methods and alternative approaches. It is shown that including the temporal relationship between interferograms in the phase unwrapping step improves the results, which leads to the three-dimensional extended minimum cost flow algorithm. To unwrap the phase in a multitemporal way, a motion model has to be considered, and the estimation of its parameters is an important step. By default, the parameters are estimated in an iterative search process in which a linear program has to be solved at each step; the best parameters are those with minimal costs. However, the choice of this search space is often not straightforward, and with this discrete optimization function the solution is often not unique. This paper presents an alternative way to estimate the motion model parameters by maximizing a continuous function, the ensemble phase coherence. With the help of a closed-loop simulation and real data, both methods, the standard and the alternative one, are numerically compared and analyzed. It is shown that maximizing the ensemble phase coherence is a good alternative to the established iterative procedure: it reduces the run time considerably and is thus well suited to the processing of large data sets.
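As a minimal illustration of the alternative estimator, the sketch below (a single-pixel toy with a linear deformation model, a Sentinel-1-like wavelength, and simulated noise; all values are hypothetical) maximizes the ensemble phase coherence |mean e^{i(φ_obs − φ_model(v))}| over the deformation rate v by a simple grid search:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 25)      # acquisition times in years (made up)
wavelength = 56.0                  # C-band wavelength in mm (Sentinel-1-like)
v_true = -8.0                      # simulated deformation rate in mm/year
phase = -4.0 * np.pi / wavelength * v_true * t + rng.normal(0.0, 0.3, t.size)
wrapped = np.angle(np.exp(1j * phase))   # wrap the phases to (-pi, pi]

def ensemble_coherence(v):
    """|mean of exp(i(phi_obs - phi_model(v)))|: equals 1 for a perfect fit."""
    model = -4.0 * np.pi / wavelength * v * t
    return np.abs(np.mean(np.exp(1j * (wrapped - model))))

# Maximize the continuous coherence over the rate v by a simple grid search.
v_grid = np.linspace(-30.0, 30.0, 1201)
coh = np.array([ensemble_coherence(v) for v in v_grid])
v_hat = v_grid[np.argmax(coh)]
```

Because the coherence is a continuous function of v, one evaluation pass replaces the iterative solve-a-linear-program-per-candidate search.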

2018, Vol 7 (4), pp. 57
Author(s): Jehhan A. Almamy, Mohamed Ibrahim, M. S. Eliwa, Saeed Al-mualim, Haitham M. Yousof

In this work, we study the two-parameter Odd Lindley Weibull lifetime model. This distribution is motivated by the wide use of the Weibull model in many applied areas and by the fact that the new generalization provides more flexibility for analyzing real data. The Odd Lindley Weibull density function can be written as a linear combination of exponentiated Weibull densities. We derive explicit expressions for the ordinary and incomplete moments, moments of the (reversed) residual life, generating functions and order statistics. We discuss the maximum likelihood estimation of the model parameters and assess the performance of the maximum likelihood estimators in terms of biases, variances and mean squared errors by means of a simulation study. The usefulness of the new model is illustrated with two real data sets, for which it provides consistently better fits than competitive models. For the glass fibres data, the Odd Lindley Weibull model outperforms the Weibull, exponentiated Weibull, Kumaraswamy Weibull, beta Weibull and three-parameter Odd Lindley Weibull models; for the time-to-failure data, it outperforms the Weibull, Lindley Weibull, transmuted complementary Weibull geometric and beta Weibull models. It is therefore a good alternative to these models in both applications.
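The simulation assessment of MLE bias and mean squared error can be sketched as follows; since the Odd Lindley Weibull density is not reproduced here, the plain two-parameter Weibull stands in as a hypothetical baseline to show the mechanics (generate, fit, aggregate):

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
shape_true, scale_true = 1.5, 2.0   # hypothetical true parameters
n_rep, n = 200, 150                 # replications and sample size

est = np.empty((n_rep, 2))
for r in range(n_rep):
    sample = weibull_min.rvs(shape_true, scale=scale_true, size=n,
                             random_state=rng)
    c_hat, _, s_hat = weibull_min.fit(sample, floc=0.0)  # MLE, location fixed
    est[r] = c_hat, s_hat

truth = np.array([shape_true, scale_true])
bias = est.mean(axis=0) - truth
mse = ((est - truth) ** 2).mean(axis=0)
```

For the actual model, only the sampling and fitting routines would change; the bias/variance/MSE bookkeeping is identical.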


2020, Vol 12 (9), pp. 1473
Author(s): Christina Esch, Joël Köhler, Karlheinz Gutjahr, Wolf-Dieter Schuh

One of the most critical steps in a multitemporal D-InSAR analysis is the resolution of the phase ambiguities in the context of phase unwrapping. The Extended Minimum Cost Flow approach is one of the potential phase unwrapping algorithms used in the Small Baseline Subset analysis. In the first step, each phase gradient is unwrapped in time using a linear motion model, and in the second step, the spatial phase unwrapping is performed individually for each interferogram. Exploiting the temporal and spatial information is a proven method, but the two-step procedure is not optimal. In this paper, a method is presented that solves both the temporal and the spatial phase unwrapping in one single step. This requires some modifications regarding the estimation of the motion model and the choice of the weights. Furthermore, the temporal inconsistency of the data, which occurs with spatially filtered interferograms, must be considered; for this purpose, so-called slack variables are inserted. To verify the method, both simulated and real data are used. The test region is the Lower Rhine Embayment in the southwest of North Rhine-Westphalia, a very rural region with noisy data. The studies show that the new approach leads to more consistent results, so that the deformation time series of the analyzed pixels can be improved.
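The role of the inserted slack variables can be illustrated with a toy linear program: inconsistent "misclosure" equations are satisfied by nonnegative slacks that alone carry cost, which is the standard way an L1-type misfit enters a minimum-cost-flow-style LP (the numbers below are made up):

```python
import numpy as np
from scipy.optimize import linprog

# Three mutually inconsistent "misclosure" equations x = b_i, a toy stand-in
# for the temporal consistency constraints of spatially filtered interferograms.
b = np.array([0.3, -0.1, 0.2])
m = b.size

# Variables: [x, s1+, s2+, s3+, s1-, s2-, s3-]. The nonnegative slacks absorb
# the residuals via x + si+ - si- = b_i, and only the slacks carry cost,
# which turns the L1 misfit into a linear program.
c = np.concatenate([[0.0], np.ones(2 * m)])
A_eq = np.hstack([np.ones((m, 1)), np.eye(m), -np.eye(m)])
bounds = [(None, None)] + [(0, None)] * (2 * m)
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=bounds, method="highs")
# res.x[0] is the L1 (median) solution; res.fun is the total absorbed misclosure
```

Small residuals are thus allowed but penalized, exactly the behavior wanted when the data cannot be made perfectly consistent in time.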


2019, Vol XVI (2), pp. 1-11
Author(s): Farrukh Jamal, Hesham Mohammed Reyad, Soha Othman Ahmed, Muhammad Akbar Ali Shah, Emrah Altun

A new three-parameter continuous model called the exponentiated half-logistic Lomax distribution is introduced in this paper. Basic mathematical properties of the proposed model are investigated, including raw and incomplete moments, skewness, kurtosis, generating functions, Rényi entropy, Lorenz, Bonferroni and Zenga curves, probability weighted moments, the stress-strength model, order statistics, and record statistics. The model parameters are estimated by the maximum likelihood method, and the behaviour of these estimates is examined through a simulation study. The applicability of the new model is illustrated by applying it to a real data set.
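Lorenz and Bonferroni curves can be computed numerically as sketched below; the exponentiated half-logistic Lomax density is not reproduced, so the plain Lomax baseline stands in, with L(p) = (1/μ)∫₀^{Q(p)} x f(x) dx and B(p) = L(p)/p:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

c = 3.0                  # Lomax shape; the mean exists for c > 1
dist = stats.lomax(c)
mu = dist.mean()         # equals 1/(c - 1) for unit scale

def lorenz(p):
    """L(p) = (1/mu) * integral_0^{Q(p)} x f(x) dx, computed numerically."""
    q = dist.ppf(p)
    val, _ = quad(lambda x: x * dist.pdf(x), 0.0, q)
    return val / mu

def bonferroni(p):
    return lorenz(p) / p

ps = np.linspace(0.05, 0.95, 19)
L = np.array([lorenz(p) for p in ps])   # increasing, below the diagonal
```

The same two numerical routines apply to any baseline with a pdf and a quantile function, so swapping in the paper's model only changes `dist`.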


Mathematics, 2021, Vol 9 (16), pp. 1850
Author(s): Rashad A. R. Bantan, Farrukh Jamal, Christophe Chesneau, Mohammed Elgarhy

Unit distributions are commonly used in probability and statistics to describe useful quantities with values between 0 and 1, such as proportions, probabilities, and percentages. Some unit distributions are defined in a natural analytical manner, while others are derived through the transformation of an existing distribution defined on a larger domain. In this article, we introduce the unit gamma/Gompertz distribution, founded on the inverse-exponential scheme and the gamma/Gompertz distribution. The gamma/Gompertz distribution is known to be a very flexible three-parameter lifetime distribution, and we aim to transpose this flexibility to the unit interval. First, we check this aspect through the analytical behavior of the primary functions. It is shown that the probability density function can be increasing, decreasing, “increasing-decreasing” and “decreasing-increasing”, with pliant asymmetric properties. On the other hand, the hazard rate function can have monotonically increasing, decreasing, or constant shapes. We complete the theoretical part with some propositions on stochastic ordering, moments, quantiles, and the reliability coefficient. Practically, to estimate the model parameters from unit data, the maximum likelihood method is used. We present some simulation results to evaluate this method. Two applications using real data sets, one on trade shares and the other on flood levels, demonstrate the importance of the new model when compared to other unit models.
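The inverse-exponential scheme can be sketched as follows, assuming the usual gamma/Gompertz survival function S(x) = β^s/(β − 1 + e^{bx})^s (parameter values are illustrative): sample X by inverting S, then map U = e^{−X} onto (0, 1):

```python
import numpy as np

rng = np.random.default_rng(2)
b, s, beta = 1.0, 0.5, 2.0   # gamma/Gompertz parameters (illustrative)

def gg_ppf(u):
    """Solve S(x) = u for the assumed survival S(x) = beta^s/(beta-1+e^{bx})^s."""
    return np.log(1.0 + beta * (u ** (-1.0 / s) - 1.0)) / b

x = gg_ppf(rng.uniform(size=10_000))   # gamma/Gompertz sample on (0, inf)
u_unit = np.exp(-x)                    # unit gamma/Gompertz sample on (0, 1)
```

Since x > 0 always, u_unit lies strictly inside the unit interval by construction.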


2021, Vol 11 (15), pp. 6998
Author(s): Qiuying Li, Hoang Pham

Many NHPP software reliability growth models (SRGMs) have been proposed over the past 40 years, but most of them model the fault detection process (FDP) in one of two ways. The first is to ignore the fault correction process (FCP), i.e., to assume that faults are removed instantaneously once the failure they cause is detected. In real software development this assumption is not realistic, since fault removal takes time: the faults causing failures cannot always be removed at once, and the detected failures become more and more difficult to correct as testing progresses. The second way is to model the fault correction process through a time delay between fault detection and fault correction, where the delay is assumed to be constant, a function of time, or a random variable following some distribution. In this paper, some useful approaches to modeling the dual fault detection and correction processes are discussed. Instead of a correction time delay, the dependencies between the fault counts of the two processes are considered. A model is proposed that integrates the fault detection and fault correction processes and incorporates a fault introduction rate and a testing coverage rate into the software reliability evaluation. The model parameters are estimated using the Least Squares Estimation (LSE) method. The descriptive and predictive performance of the proposed model and of other existing NHPP SRGMs is investigated on three real data sets using four criteria. The results show that the new model yields significantly better reliability estimation and prediction.
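Least-squares estimation of an NHPP mean value function can be sketched as below; the paper's combined detection/correction model is not reproduced, so the classic Goel-Okumoto form m(t) = a(1 − e^{−bt}) and made-up fault counts stand in:

```python
import numpy as np
from scipy.optimize import curve_fit

# Cumulative detected faults at the end of each test week (made-up data).
t = np.arange(1, 13, dtype=float)
n_detected = np.array([12, 21, 28, 34, 39, 43, 46, 49, 51, 52, 53, 54], float)

def mvf(t, a, b):
    """Goel-Okumoto mean value function: a = total faults, b = detection rate."""
    return a * (1.0 - np.exp(-b * t))

(a_hat, b_hat), _ = curve_fit(mvf, t, n_detected, p0=(60.0, 0.1))
sse = np.sum((n_detected - mvf(t, a_hat, b_hat)) ** 2)  # LSE criterion value
```

A richer model (dual detection/correction counts, fault introduction, coverage) changes only `mvf` and the parameter vector; the LSE machinery stays the same.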


Author(s): Fabio Sabetta, Antonio Pugliese, Gabriele Fiorentino, Giovanni Lanzano, Lucia Luzi

This work presents an up-to-date model for the simulation of non-stationary ground motions, including several novelties compared to the original study of Sabetta and Pugliese (Bull Seism Soc Am 86:337–352, 1996). The selection of the input motion in the framework of earthquake engineering has become progressively more important with the growing use of nonlinear dynamic analyses. Despite the increasing availability of large strong motion databases, ground motion records are not always available for a given earthquake scenario and site condition, requiring the adoption of simulated time series. Among the different techniques for the generation of ground motion records, we focused on methods based on stochastic simulation, considering the time-frequency decomposition of the seismic ground motion. We updated the non-stationary stochastic model initially developed in Sabetta and Pugliese (1996) and later modified by Pousse et al. (Bull Seism Soc Am 96:2103–2117, 2006) and Laurendeau et al. (Nonstationary stochastic simulation of strong ground-motion time histories: application to the Japanese database. 15 WCEE Lisbon, 2012). The model is based on the S-transform, which implicitly considers both amplitude and frequency modulation. The four model parameters required for the simulation are Arias intensity, significant duration, central frequency, and frequency bandwidth. They were obtained from an empirical ground motion model calibrated using the accelerometric records included in the updated Italian strong-motion database ITACA. The simulated accelerograms show a good match with the ground motion model prediction of several amplitude and frequency measures, such as Arias intensity, peak acceleration, peak velocity, Fourier spectra, and response spectra.
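A heavily simplified stand-in for this kind of non-stationary stochastic simulation (not the paper's S-transform model): band-limited noise around a central frequency, shaped by a time envelope and scaled to a target Arias intensity; all parameter values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
dt, dur = 0.01, 20.0
t = np.arange(0.0, dur, dt)

# Target parameters (all hypothetical, not from the ITACA-based model):
arias, t_sig, fc, fb = 0.5, 8.0, 3.0, 2.0   # m/s, s, Hz, Hz

# Band-limited noise around the central frequency, shaped by a time envelope.
env = np.exp(-0.5 * ((t - dur / 3.0) / (t_sig / 2.0)) ** 2)
spec = np.fft.rfft(rng.normal(size=t.size))
f = np.fft.rfftfreq(t.size, dt)
spec *= np.exp(-0.5 * ((f - fc) / fb) ** 2)   # Gaussian band-pass
acc = env * np.fft.irfft(spec, t.size)

# Scale to the target Arias intensity I_a = (pi / 2g) * integral of a(t)^2 dt.
g = 9.81
ia = np.pi / (2.0 * g) * np.sum(acc ** 2) * dt
acc *= np.sqrt(arias / ia)
```

The S-transform model replaces the fixed Gaussian band-pass with a time-varying spectrum, so the frequency content can evolve along the record.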


2013, Vol 2013, pp. 1-9
Author(s): Ali Kargarnejad, Mohsen Taherbaneh, Amir Hosein Kashefi

Tracking the maximum power point of a solar panel is of interest in most photovoltaic applications. Modeling a solar panel exclusively from manufacturer data is also of great interest: since manufacturers generally give the electrical specifications of their products at one operating condition, there are many cases in which the specifications at other conditions are needed. In this research, a comprehensive one-diode model for a solar panel with maximum obtainable accuracy is developed based only on datasheet values. The dependencies of the model parameters on environmental conditions are taken into consideration as much as possible. Comparison between real data and simulation results shows that the proposed model has maximum obtainable accuracy. A new fuzzy-based controller to track the maximum power point of the solar panel is then proposed, which offers a better response than the previously developed common approach in terms of speed, accuracy, and stability.
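The one-diode model can be sketched as below, solving the implicit equation I = Iph − I0(e^{(V+IRs)/(nVt)} − 1) − (V+IRs)/Rsh for each terminal voltage; all parameter values are illustrative, not taken from a real datasheet:

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative one-diode parameters (hypothetical, not from a real datasheet):
# I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh
Iph, I0, Rs, Rsh, n = 8.0, 1e-9, 0.3, 200.0, 1.3
Vt = 0.0257 * 60          # thermal voltage times cell count (60-cell module)

def current(V):
    """Solve the implicit diode equation for I at a given terminal voltage V."""
    f = lambda I: (Iph - I0 * np.expm1((V + I * Rs) / (n * Vt))
                   - (V + I * Rs) / Rsh - I)
    return brentq(f, -2.0, Iph + 1.0)

V = np.linspace(0.0, 40.0, 400)
I = np.array([current(v) for v in V])
P = V * I
v_mpp = V[np.argmax(P)]   # voltage at the maximum power point
```

Sweeping the I-V curve this way yields the power curve whose peak an MPPT controller, fuzzy or otherwise, must track as the environmental conditions shift the parameters.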


Geophysics, 2016, Vol 81 (4), pp. U25-U38
Author(s): Nuno V. da Silva, Andrew Ratcliffe, Vetle Vinje, Graham Conroy

Parameterization lies at the center of anisotropic full-waveform inversion (FWI) with multiparameter updates. This is because FWI aims to update both the long and short wavelengths of the perturbations, so it is important that the parameterization accommodates this. Recently, there has been an intensive effort to determine the optimal parameterization, centering the fundamental discussion mainly on the analysis of the radiation patterns of each parameterization and aiming to determine which is best suited for multiparameter inversion. We have developed a new parameterization in the scope of FWI, based on the concept of kinematically equivalent media, as originally proposed in other areas of seismic data analysis. Our analysis is also based on radiation patterns, as well as on the relation between the perturbation of this set of parameters and the perturbation in traveltime. The radiation pattern reveals that this parameterization combines some of the characteristics of parameterizations with one velocity and two Thomsen parameters and of parameterizations with two velocities and one Thomsen parameter. The study of the perturbation of traveltime with the perturbation of model parameters shows that the new parameterization is less ambiguous in relating these quantities than other, more commonly used parameterizations. We conclude that our new parameterization is well suited for inverting diving waves, which are of paramount importance for carrying out practical FWI successfully. We demonstrate that the new parameterization produces good inversion results with synthetic and real data examples. In the real data example from the Central North Sea, the inverted models show good agreement with the geologic structures, leading to an improvement of the seismic image and of the flatness of the common image gathers.
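For reference, the sketch below computes Thomsen's parameters for a VTI medium from illustrative stiffnesses (the values are hypothetical, not the paper's field data); these are the quantities on which the one-velocity/two-parameter parameterizations discussed above are built:

```python
import math

# Illustrative VTI stiffnesses in GPa and density (hypothetical values):
c11, c33, c44, c66, c13 = 45.0, 37.5, 12.0, 15.0, 13.0
rho = 2400.0  # kg/m^3

# Thomsen's (1986) anisotropy parameters for a VTI medium:
eps = (c11 - c33) / (2.0 * c33)
gam = (c66 - c44) / (2.0 * c44)
delta = ((c13 + c44) ** 2 - (c33 - c44) ** 2) / (2.0 * c33 * (c33 - c44))

# Vertical P- and S-wave velocities (stiffnesses converted to Pa):
vp0 = math.sqrt(c33 * 1e9 / rho)
vs0 = math.sqrt(c44 * 1e9 / rho)
```

Choosing which combinations of (vp0, vs0, eps, gam, delta) to invert for is exactly the parameterization question the paper addresses.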


2017, Vol 6 (3), pp. 141
Author(s): Thiago A. N. De Andrade, Luz Milena Zea Fernandez, Frank Gomes-Silva, Gauss M. Cordeiro

We study a three-parameter model named the gamma generalized Pareto distribution. This distribution extends the generalized Pareto model, which has many applications in areas such as insurance, reliability, finance and many others. We derive some of its characterizations and mathematical properties including explicit expressions for the density and quantile functions, ordinary and incomplete moments, mean deviations, Bonferroni and Lorenz curves, generating function, Rényi entropy and order statistics. We discuss the estimation of the model parameters by maximum likelihood. A small Monte Carlo simulation study and two applications to real data are presented. We hope that this distribution may be useful for modeling survival and reliability data.
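Assuming the Zografos-Balakrishnan gamma-G construction f(x) = g(x)[−log(1 − G(x))]^{a−1}/Γ(a) (an assumption for illustration; the paper's exact definition may differ), the density with a generalized Pareto baseline can be sketched and checked numerically:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad
from scipy.special import gamma as gamma_fn

a = 2.0                       # gamma shape parameter
base = stats.genpareto(0.2)   # generalized Pareto baseline, shape xi = 0.2

def pdf(x):
    """Assumed gamma-G density: g(x) * (-log(1 - G(x)))^(a-1) / Gamma(a)."""
    G = base.cdf(x)
    return base.pdf(x) * (-np.log1p(-G)) ** (a - 1.0) / gamma_fn(a)

# The density should integrate to one (the tail beyond 5000 is negligible).
total, _ = quad(pdf, 0.0, 5000.0)

# Sampling: X = Q_G(1 - e^{-T}) with T ~ Gamma(a) reproduces this density.
rng = np.random.default_rng(5)
x = base.ppf(-np.expm1(-rng.gamma(a, size=5000)))
```

The sampling line also shows why the family is convenient for Monte Carlo work: only a gamma draw and the baseline quantile function are needed.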


2020, pp. 1-22
Author(s): Luis E. Nieto-Barajas, Rodrigo S. Targino

We propose a stochastic model for claims reserving that captures dependence along development years within a single triangle. This dependence is based on a gamma process with a moving average form of order p ≥ 0, which is achieved through the use of Poisson latent variables. We carry out Bayesian inference on the model parameters and borrow strength across several triangles, coming from different lines of business or companies, through the use of hierarchical priors. We carry out a simulation study as well as a real data analysis. Results show that reserve estimates for the real data set studied are more accurate with our gamma dependence model than with the benchmark over-dispersed Poisson model that assumes independence.
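A related but simpler construction (order-1 Markov rather than the paper's MA(p) form, so it is only an illustration of how Poisson latent variables induce dependence while preserving the gamma marginal, not the paper's exact model):

```python
import numpy as np

rng = np.random.default_rng(4)
a, b, c = 3.0, 2.0, 5.0   # gamma shape/rate and a dependence parameter
T = 50_000

theta = np.empty(T)
theta[0] = rng.gamma(a, 1.0 / b)
for i in range(T - 1):
    y = rng.poisson(c * theta[i])                    # Poisson latent variable
    theta[i + 1] = rng.gamma(a + y, 1.0 / (b + c))   # dependent gamma draw

# The stationary marginal stays Ga(a, b); the lag-1 correlation is c/(b + c).
```

Setting c = 0 removes the latent variables and recovers independent gamma draws, which is the sense in which the independence benchmark is nested.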

