From Feature Space to Primal Space: KPCA and Its Mixture Model

Author(s):  
Haixian Wang
Energies ◽  
2020 ◽  
Vol 13 (18) ◽  
pp. 4901
Author(s):  
Zhenyu He ◽  
Xiaochen Zhang ◽  
Chao Liu ◽  
Te Han

The fault prognostics of photovoltaic (PV) power generation systems is expected to become a significant challenge as more and more PV systems with increasingly large capacities come into existence. The PV inverter is the core component of the PV system, and it is essential to develop approaches that accurately predict the occurrence of inverter faults to ensure the PV system’s safety. This paper proposes a fault prognostics method that makes full use of the similarities between inverter clusters. First, a feature space was constructed using the t-distributed stochastic neighbor embedding (t-SNE) algorithm. Then, a fast clustering algorithm was used to search the feature space for the center inverter at each sampling time. The status of the center inverter was adopted to establish the health baseline. Finally, a Gaussian mixture model was established with two data clusters based on the center inverter and the inverter to be predicted. The divergence of the two clusters could then be used to predict the inverter’s fault. The performance of the proposed method was evaluated with real PV monitoring data. The experimental results showed that the proposed method successfully predicted the occurrence of an inverter fault 3 months in advance.
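The final divergence step could be sketched as follows. This is a minimal illustration, not the authors' implementation: it fits one Gaussian to each cluster (center inverter vs. inverter under test) and uses the closed-form Kullback–Leibler divergence between the two Gaussians as the health indicator; the abstract does not specify which divergence measure was actually used, and all data here are synthetic.

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence KL(N0 || N1) between two Gaussians."""
    k = mu0.shape[0]
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0)
                  + diff @ inv1 @ diff
                  - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def divergence_indicator(center_feats, test_feats):
    """Fit one Gaussian to each inverter's feature samples and
    return their KL divergence as a health indicator."""
    mu0, cov0 = center_feats.mean(axis=0), np.cov(center_feats, rowvar=False)
    mu1, cov1 = test_feats.mean(axis=0), np.cov(test_feats, rowvar=False)
    return gaussian_kl(mu0, cov0, mu1, cov1)

rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, size=(200, 2))   # center-inverter features
drifted = rng.normal(1.5, 1.0, size=(200, 2))   # simulated degrading inverter
print(divergence_indicator(healthy, healthy[:100]))  # small
print(divergence_indicator(healthy, drifted))        # larger
```

A rising divergence between the test inverter's cluster and the healthy baseline cluster would then signal an impending fault.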


2011 ◽  
Vol 121-126 ◽  
pp. 1151-1155
Author(s):  
Zhi Yuan Chen ◽  
Gang Luo ◽  
Zhi Gen Fei

Image segmentation technology has been extensively applied in many fields. As the foundation of image identification, effective image segmentation plays a significant role in subsequent image processing. Many theories and methods for image segmentation have been presented and discussed, such as the K-means and fuzzy C-means methods, region-based methods, and edge-detection-based methods. In this work, we propose applying Bayesian decision theory based on minimum error probability to gray-image segmentation. This approach guarantees the minimum segmentation error probability, which is generally what we desire. Under the assumption that the gray values follow a Gaussian finite mixture model in the image feature space, the EM algorithm is used to estimate the parameters of the mixture model. To improve the convergence speed of the EM algorithm, a novel method called weighted equal-interval sampling is presented to obtain a contracted sample set. Consequently, the computational burden of the EM algorithm is greatly reduced. The final experiments demonstrate the feasibility and high effectiveness of the method.
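The core of this approach — EM estimation of a two-component Gaussian mixture over gray values, followed by the minimum-error (Bayes) decision — can be sketched as below. This is a generic textbook EM for a 1-D mixture on synthetic data, not the paper's weighted equal-interval sampling variant:

```python
import numpy as np

def em_gmm_1d(x, iters=100):
    """EM for a two-component 1-D Gaussian mixture (pixel gray values)."""
    # crude initialisation from the data quantiles
    mu = np.array([np.quantile(x, 0.25), np.quantile(x, 0.75)])
    var = np.array([x.var(), x.var()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each pixel
        pdf = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and variances
        nk = r.sum(axis=0)
        pi, mu = nk / len(x), (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

def bayes_label(x, pi, mu, var):
    """Minimum-error-probability decision: assign each gray value to
    the component with the larger posterior probability."""
    pdf = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    return pdf.argmax(axis=1)

rng = np.random.default_rng(1)
gray = np.concatenate([rng.normal(60, 10, 3000),    # dark region
                       rng.normal(170, 15, 2000)])  # bright region
pi, mu, var = em_gmm_1d(gray)
labels = bayes_label(gray, pi, mu, var)
```

Picking the maximum-posterior component is exactly the Bayes rule that minimizes the expected segmentation error probability.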


Author(s):  
Qingxiu Guo ◽  
Jianchang Liu ◽  
Shubin Tan ◽  
Dongsheng Yang ◽  
Yuan Li ◽  
...  

For multimode process monitoring, accurate mode information is difficult to obtain, and monitoring each mode separately increases the complexity of the system. This paper proposes a multimode process monitoring strategy via an improved variational inference Gaussian mixture model based on locality preserving projections (IVIGMM-LPP). First, the raw data are projected into a feature space in which samples maintain their original neighbor structure. Second, a new discriminant condition is introduced to reduce the influence of the initial category parameter on the iteration results of the VIGMM model. Then, the data are updated using the modal information, so that the scales of the different modes are adjusted to the same level. Next, a deviation vector is introduced to eliminate the multi-center structure of the data. Finally, a statistic is built to monitor the process. IVIGMM-LPP establishes a single model for monitoring without the premise of knowing the mode information, which reduces the complexity of the monitoring process and improves the fault detection rate. The experimental results of a numerical case and the Tennessee Eastman (TE) process verify the effectiveness of IVIGMM-LPP.
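The deviation-vector idea can be illustrated with a simple sketch. Assuming (hypothetically) that mode centers are available, subtracting each sample's nearest center collapses the multi-center structure into one cluster, on which a Hotelling-style T² statistic with an empirical control limit can be computed; the paper's actual statistic and VIGMM machinery are not reproduced here.

```python
import numpy as np

def deviation_vectors(X, centers):
    """Subtract each sample's nearest mode center, collapsing the
    multi-center (multimode) structure into a single cluster."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return X - centers[d2.argmin(axis=1)]

def t2_statistic(D_train, D_test):
    """Hotelling-style T^2 of test deviations under the training model."""
    mu = D_train.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(D_train, rowvar=False))
    diff = D_test - mu
    return np.einsum('ij,jk,ik->i', diff, inv_cov, diff)

rng = np.random.default_rng(2)
centers = np.array([[0.0, 0.0], [8.0, 8.0]])   # two operating modes
train = np.vstack([c + rng.normal(0, 1, (300, 2)) for c in centers])
D = deviation_vectors(train, centers)
limit = np.quantile(t2_statistic(D, D), 0.99)  # empirical 99% control limit
fault = deviation_vectors(np.array([[4.0, 0.0]]), centers)  # off-mode sample
print(t2_statistic(D, fault)[0] > limit)
```

A sample far from every mode center yields a large deviation vector, and its T² value exceeds the control limit, flagging a fault with one model covering all modes.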


2013 ◽  
Vol 6 (1) ◽  
pp. 287-301
Author(s):  
Charles Savarimuthu ◽  
L. Arockiam

Feature reduction is a kind of dimensionality reduction of the feature space. A number of approaches are used to identify the significant features, but they do not use a weighting approach. The weighting approach is quite useful for obtaining the significant features and removing the insignificant and irrelevant ones using the OWA formulation. The aim of this approach is to obtain the significant features and remove the insignificant ones by using a pairwise approach. This approach helps find the weights of pairwise features at the same time, which leads to the removal of insignificant features from the feature space using OWA. The significance of the OWA formulation is that the paired features are identified a priori and the sum of their weights is equal to 1. The OWA criterion is introduced to obtain the significant features that are useful for predicting the accuracy of the cluster in a GMM.
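The ordered weighted averaging (OWA) operator itself is standard: weights summing to 1 are applied to the criterion values after sorting them in descending order. The sketch below is only an illustration of that operator applied to feature scoring; the paper's exact pairing scheme is not specified in the abstract, so here each feature is scored by OWA-aggregating a pair of criteria (relevance to a target and non-redundancy), with the pair of weights summing to 1 as the abstract requires.

```python
import numpy as np

def owa(values, weights):
    """Ordered weighted averaging: weights (summing to 1) are applied
    to the values sorted in descending order."""
    return np.sort(values)[::-1] @ weights

def owa_feature_scores(X, y, weights=np.array([0.7, 0.3])):
    """Score each feature by OWA-aggregating a pair of criteria:
    |correlation| with the target, and one minus its mean redundancy
    (|correlation|) with the other features."""
    n = X.shape[1]
    C = np.abs(np.corrcoef(X, rowvar=False))          # feature-feature |corr|
    rel = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n)])
    red = (C.sum(axis=0) - 1) / (n - 1)               # mean redundancy
    return np.array([owa(np.array([rel[j], 1 - red[j]]), weights)
                     for j in range(n)])

rng = np.random.default_rng(3)
y = rng.normal(size=500)
X = np.column_stack([y + 0.1 * rng.normal(size=500),  # relevant feature
                     rng.normal(size=500),            # noise
                     rng.normal(size=500)])           # noise
scores = owa_feature_scores(X, y)
print(scores.argmax())  # the relevant feature ranks first
```

Features whose aggregated OWA score falls below a chosen threshold would then be removed before GMM clustering.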


2014 ◽  
Vol 136 (6) ◽  
Author(s):  
Sheng Hong ◽  
Baoqing Wang ◽  
Guoqi Li ◽  
Qian Hong

This paper proposes a novel performance degradation assessment method for bearings based on ensemble empirical mode decomposition (EEMD) and a Gaussian mixture model (GMM). EEMD is applied to preprocess the nonstationary vibration signals and obtain the feature space. The GMM is utilized to approximate the density distribution of the lower-dimensional feature space processed by principal component analysis (PCA). The confidence value (CV) is calculated based on the overlap between the distribution of the baseline feature space and that of the testing feature space to indicate the performance of the bearing. The experimental results demonstrate the effectiveness of the proposed method.
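The PCA-plus-baseline-distribution part of this pipeline could look like the sketch below. It is a simplification under stated assumptions: a single Gaussian stands in for the GMM, the CV is taken as `exp(-c · mean Mahalanobis distance²)` as a proxy for distribution overlap (the paper's exact overlap measure is not given in the abstract), and the features are synthetic rather than EEMD outputs.

```python
import numpy as np

def fit_baseline(X, n_comp=2):
    """PCA via SVD on baseline (healthy) features, then a single
    Gaussian fitted in the reduced space as the health baseline."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    comps = Vt[:n_comp]
    Z = (X - mean) @ comps.T
    return mean, comps, Z.mean(axis=0), np.cov(Z, rowvar=False)

def confidence_value(X_test, mean, comps, mu, cov, c=0.05):
    """CV in (0, 1]: exp(-c * mean squared Mahalanobis distance) of the
    test features under the baseline Gaussian, as an overlap proxy."""
    Z = (X_test - mean) @ comps.T
    diff = Z - mu
    d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
    return float(np.exp(-c * d2.mean()))

rng = np.random.default_rng(4)
baseline = rng.normal(0, 1, (500, 4))        # healthy vibration features
params = fit_baseline(baseline)
cv_ok = confidence_value(rng.normal(0, 1, (200, 4)), *params)
cv_bad = confidence_value(rng.normal(0, 3, (200, 4)), *params)  # degraded
```

A CV near 1 indicates the test distribution still overlaps the baseline; a falling CV tracks the bearing's degradation.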


VASA ◽  
2008 ◽  
Vol 37 (Supplement 73) ◽  
pp. 26-32 ◽  
Author(s):  
Schlattmann ◽  
Höhne ◽  
Plümper ◽  
Heidrich

Background: In order to analyze the prevalence of Raynaud’s syndrome in diseases such as scleroderma and Sjögren’s syndrome, a meta-analysis of published data was performed. Methods: The PubMed database of the National Library of Medicine was searched for studies dealing with Raynaud’s syndrome and scleroderma or Raynaud’s syndrome and Sjögren’s syndrome, respectively. The studies found provided data sufficient to estimate the prevalence of Raynaud’s syndrome. The statistical analysis was based on methods for a fixed-effects meta-analysis and a finite mixture model for proportions. Results: For scleroderma, a pooled prevalence of 80.9% (95% CI: 0.78, 0.83) was obtained. A mixture model analysis found four latent classes. We identified a class with a very low prevalence of 11%, weighted with 0.15. On the other hand, there is a class with a very high prevalence of 96%. Analysing the association with Sjögren’s syndrome, the pooled analysis leads to a prevalence of Raynaud’s syndrome of 32% (95% CI: 26.7%, 37.7%). A mixture model finds a solution with two latent classes: 38% of the studies show a prevalence of 18.8%, whereas 62% observe a prevalence of 38.3%. Conclusion: There is strong variability among studies reporting the prevalence of Raynaud’s syndrome in patients suffering from scleroderma or Sjögren’s syndrome. The available data are insufficient to perform a proper quantitative analysis of the association of Raynaud’s phenomenon with scleroderma or Sjögren’s syndrome. Properly planned and reported epidemiological studies are needed in order to perform a thorough quantitative analysis of risk factors for Raynaud’s syndrome.
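The fixed-effects pooling of proportions can be sketched as below. The study counts are purely hypothetical (the abstract reports only the pooled results, not per-study data), and the confidence interval uses the simple normal approximation; the mixture-model step for latent classes is not reproduced.

```python
import math

def pooled_prevalence(events, sizes, z=1.96):
    """Fixed-effects pooled proportion across studies with a
    normal-approximation 95% confidence interval."""
    n = sum(sizes)
    p = sum(events) / n
    se = math.sqrt(p * (1 - p) / n)       # standard error of the pooled proportion
    return p, (p - z * se, p + z * se)

# hypothetical per-study counts (cases with Raynaud's / study size)
events = [160, 85, 190]
sizes = [200, 100, 240]
p, (lo, hi) = pooled_prevalence(events, sizes)
print(f"pooled prevalence {p:.1%}, 95% CI ({lo:.1%}, {hi:.1%})")
```

When between-study heterogeneity is strong, as reported here, such a single pooled estimate can be misleading, which is why the authors additionally fit a finite mixture model to identify latent classes of studies.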


2012 ◽  
Author(s):  
Tom Busey ◽  
Chen Yu ◽  
Francisco Parada ◽  
Brandi Emerick ◽  
John Vanderkolk
