A class of learning algorithms for principal component analysis and minor component analysis

2000 ◽  
Vol 11 (2) ◽  
pp. 529-533 ◽  
Author(s):  
Q. Zhang ◽  
Yiu-Wing Leung


2002 ◽  
Vol 14 (5) ◽  
pp. 1169-1182 ◽  
Author(s):  
C. K. I. Williams ◽  
F. V. Agakov

Recently, Hinton introduced the products-of-experts architecture for density estimation, in which individual expert probabilities are multiplied together and renormalized. We consider products of Gaussian "pancakes": Gaussians that are equally elongated in all directions except one. We prove that the maximum likelihood solution for this model gives rise to a minor component analysis solution. We also discuss the covariance structure of sums and products of Gaussian pancakes and of one-factor probabilistic principal component analysis models.
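As a minimal illustration of what a minor component analysis solution means (a toy numerical sketch, not the paper's product-of-Gaussian-pancakes derivation): the minor component is the direction of *least* variance, i.e. the eigenvector of the sample covariance matrix with the smallest eigenvalue, in contrast to the principal component, which has the largest.

```python
import numpy as np

# Toy data: most variance along the first axis, least along the third.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) * np.array([3.0, 1.0, 0.1])

# Minor component: eigenvector of the sample covariance with the
# smallest eigenvalue; np.linalg.eigh returns eigenvalues ascending.
C = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)
minor = eigvecs[:, 0]        # direction of least variance
principal = eigvecs[:, -1]   # direction of most variance, for contrast

print(np.abs(minor))         # close to [0, 0, 1]
print(np.abs(principal))     # close to [1, 0, 0]
```

For this data the minor component aligns with the narrow third axis, the direction in which a Gaussian "pancake" is compressed.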


2020 ◽  
Vol 32 (10) ◽  
pp. 1901-1935
Author(s):  
Keishi Sando ◽  
Hideitsu Hino

Principal component analysis (PCA) is a widely used method for data processing tasks such as dimension reduction and visualization. Standard PCA is known to be sensitive to outliers, and various robust PCA methods have been proposed. It has been shown that the robustness of many statistical methods can be improved by using mode estimation instead of mean estimation, because mode estimation is not significantly affected by the presence of outliers. This study therefore proposes modal principal component analysis (MPCA), a robust PCA method based on mode estimation. The proposed method finds the minor component by estimating the mode of the projected data points. As theoretical contributions, the probabilistic convergence property, influence function, finite-sample breakdown point, and a lower bound on that breakdown point are derived for the proposed MPCA. The experimental results show that the proposed method has advantages over conventional robust PCA methods.
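The robustness claim that motivates MPCA can be sketched on one-dimensional data (a toy illustration, not the paper's algorithm): the sample mean is pulled toward a cluster of outliers, while a kernel-based mode estimate barely moves. The `kde_mode` helper and the bandwidth `h=0.5` below are illustrative choices, not values from the paper.

```python
import numpy as np

# 200 inliers near 0 plus 20 outliers near 10.
rng = np.random.default_rng(1)
clean = rng.normal(loc=0.0, scale=1.0, size=200)
outliers = rng.normal(loc=10.0, scale=0.5, size=20)
data = np.concatenate([clean, outliers])

def kde_mode(x, h=0.5, iters=100):
    """Estimate the mode by a mean-shift fixed-point iteration
    under a Gaussian kernel with bandwidth h (illustrative helper)."""
    m = np.median(x)                   # robust starting point
    for _ in range(iters):
        w = np.exp(-0.5 * ((x - m) / h) ** 2)
        m = np.sum(w * x) / np.sum(w)  # mean-shift update
    return m

print(np.mean(data))    # pulled noticeably toward 10 by the outliers
print(kde_mode(data))   # stays near 0
```

Replacing the mean by such a mode estimate on the *projected* data points is, in spirit, the substitution that makes MPCA robust.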

