An Efficient Algorithm for Tensor Principal Component Analysis via Proximal Linearized Alternating Direction Method of Multipliers

Author(s):
Linbo Qiao,
Bofeng Zhang,
Lei Zhuang,
Jinshu Su

Author(s):
Duo Wang,
Toshihisa Tanaka

Kernel principal component analysis (KPCA) is a kernelized version of principal component analysis (PCA). A kernel principal component is a superposition of kernel functions. Because the number of kernel functions equals the number of samples, each component is not a sparse representation. Our purpose is to sparsify the coefficients expressing each component as a linear combination of kernel functions; to this end, two types of sparse kernel principal component are proposed in this paper. The method for solving the sparse problem comprises two steps: (a) starting from the Pythagorean theorem, we derive an explicit regression expression of KPCA, and (b) one of two regularizers, the $l_1$-norm or the $l_{2,1}$-norm, is added to the regression expression to obtain the two sparsity forms, respectively. Because the proposed objective function differs from that of elastic-net-based sparse PCA (SPCA), the SPCA method cannot be applied directly to the proposed cost function. We show that the sparse representations are obtained by iterative optimization with an alternating direction method of multipliers (ADMM). Experiments on toy examples and real data confirm the performance and effectiveness of the proposed method.
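The abstract does not give the authors' exact updates, but the core mechanism it names, sparsifying regression coefficients with an $l_1$ penalty via ADMM, can be illustrated on a generic $l_1$-regularized least-squares problem. The sketch below is an assumption-laden stand-in (the function names, the lasso objective, and the penalty weights are illustrative, not the paper's), showing the characteristic ADMM pattern: a ridge-like least-squares update, a soft-thresholding step that produces exact zeros, and a dual ascent.

```python
import numpy as np

def soft_threshold(v, kappa):
    # Elementwise soft-thresholding: the proximal operator of the l1-norm.
    # This is the step that yields exactly-zero coefficients.
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=300):
    """Minimize 0.5*||A x - b||^2 + lam*||x||_1 by ADMM.

    Splits the variable as x = z and alternates:
      x-update: ridge-like least squares (cached Cholesky factor),
      z-update: soft-thresholding (enforces sparsity),
      u-update: dual ascent on the split constraint.
    """
    n, p = A.shape
    AtA = A.T @ A
    Atb = A.T @ b
    # Factor (A^T A + rho I) once; it is reused every iteration.
    L = np.linalg.cholesky(AtA + rho * np.eye(p))
    x = z = u = np.zeros(p)
    for _ in range(n_iter):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))  # x-update
        z = soft_threshold(x + u, lam / rho)               # z-update
        u = u + x - z                                      # dual update
    return z
```

On a noiseless toy problem with a few active coefficients, the returned `z` recovers the sparse support while the inactive coefficients are exactly zero; the $l_{2,1}$ variant mentioned in the abstract would replace the elementwise soft-thresholding with group-wise shrinkage of row norms.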


2007, Vol. 17 (1), pp. 59-64

Author(s):
Ning Sun,
Hai-xian Wang,
Zhen-hai Ji,
Cai-rong Zou,
Li Zhao
