Comparison of CP and Tucker tensor decomposition algorithms

Author(s):  
Elizabeth Hale ◽  
Ashley Prater-Bennette
2020 ◽  
Vol 5 (2) ◽  
pp. 13-32
Author(s):  
Hye-Kyung Yang ◽  
Hwan-Seung Yong

Abstract
Purpose: We propose InParTen2, a multi-aspect parallel factor analysis three-dimensional tensor decomposition algorithm based on the Apache Spark framework. The proposed method reduces re-decomposition cost and can handle large tensors.
Design/methodology/approach: Considering that tensor addition increases the size of a given tensor along all axes, the proposed method decomposes incoming tensors using existing decomposition results without generating sub-tensors. Additionally, InParTen2 avoids the calculation of Khatri–Rao products and minimizes shuffling by using the Apache Spark platform.
Findings: The performance of InParTen2 is evaluated by comparing its execution time and accuracy with those of existing distributed tensor decomposition methods on various datasets. The results confirm that InParTen2 can process large tensors and reduce the re-calculation cost of tensor decomposition. Consequently, the proposed method is faster than existing tensor decomposition algorithms and can significantly reduce re-decomposition cost.
Research limitations: There are several Hadoop-based distributed tensor decomposition algorithms as well as MATLAB-based decomposition methods. However, the former require longer iteration times, and therefore their execution time cannot be compared with that of Spark-based algorithms, whereas the latter run on a single machine, which limits their ability to handle large data.
Practical implications: The proposed algorithm reduces re-decomposition cost when tensors are added to a given tensor by decomposing them based on existing decomposition results, without re-decomposing the entire tensor.
Originality/value: The proposed method can handle large tensors and is fast within the limited-memory framework of Apache Spark. Moreover, InParTen2 supports both static and incremental tensor decomposition.
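The abstract notes that InParTen2 avoids explicit Khatri–Rao products, the dense intermediate that dominates the cost of standard CP fitting. For context, a minimal single-machine sketch of plain CP-ALS with that product made explicit is given below; it is illustrative only and is not the authors' Spark implementation (the names `khatri_rao` and `cp_als` are ours).

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker (Khatri-Rao) product of A (I x R) and B (J x R)."""
    I, R = A.shape
    J, _ = B.shape
    # Each column of the result is the Kronecker product of the matching columns.
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def cp_als(X, rank, n_iter=50, seed=0):
    """One-shot CP decomposition of a 3-way tensor via alternating least squares."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X1 = X.reshape(I, J * K)                     # mode-1 unfolding
    X2 = X.transpose(1, 0, 2).reshape(J, I * K)  # mode-2 unfolding
    X3 = X.transpose(2, 0, 1).reshape(K, I * J)  # mode-3 unfolding
    for _ in range(n_iter):
        # Each factor update solves a least-squares problem against a
        # Khatri-Rao product of the other two factors -- the step that
        # incremental methods such as InParTen2 aim to avoid materializing.
        A = X1 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = X2 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = X3 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C
```

Note that each iteration materializes a `JK x R` (or comparable) dense matrix, which is exactly what becomes prohibitive at scale and motivates avoiding it in a distributed setting.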


Author(s):  
Le Trung Thanh ◽  
Viet-Dung Nguyen ◽  
Nguyen Linh-Trung ◽  
Karim Abed-Meraim

Tensor decomposition has recently become a popular method of multi-dimensional data analysis in various applications. The main interest in tensor decomposition lies in dimensionality reduction, approximation, and subspace estimation. However, the emergence of “big data” now gives rise to increased computational complexity when performing tensor decomposition. In this paper, motivated by the advantages of the generalized minimum noise subspace (GMNS) method, recently proposed for array processing, we propose two algorithms for principal subspace analysis (PSA) and two algorithms for tensor decomposition using parallel factor analysis (PARAFAC) and higher-order singular value decomposition (HOSVD). The proposed decomposition algorithms preserve several desired properties of PARAFAC and HOSVD while substantially reducing the computational complexity. The performance of the proposed PSA and tensor decomposition algorithms was compared against state-of-the-art methods via numerical experiments. The experimental results indicate that the proposed algorithms are of practical value.
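For readers unfamiliar with HOSVD, one of the two decompositions targeted above, a minimal truncated HOSVD for a 3-way tensor can be sketched as follows. This is the textbook algorithm, not the GMNS-based variant the paper proposes; the function names are illustrative.

```python
import numpy as np

def unfold(X, mode):
    """Mode-n unfolding: the chosen axis becomes rows, the rest are flattened."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def hosvd(X, ranks):
    """Truncated higher-order SVD: a core tensor plus one factor matrix per mode."""
    factors = []
    for mode, r in enumerate(ranks):
        # Leading left singular vectors of each unfolding give the mode subspace.
        U, _, _ = np.linalg.svd(unfold(X, mode), full_matrices=False)
        factors.append(U[:, :r])
    # Core = X multiplied by U_n^T along every mode.
    G = np.einsum('ijk,ia,jb,kc->abc', X, *factors)
    return G, factors

def reconstruct(G, factors):
    """Multiply the core back out along every mode."""
    return np.einsum('abc,ia,jb,kc->ijk', G, *factors)
```

With full ranks the reconstruction is exact; truncating the ranks yields the low-multilinear-rank approximation whose per-mode SVDs are the computational bottleneck that subspace methods such as GMNS aim to cheapen.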


2020 ◽  
pp. 10-16
Author(s):  
Efraín PADILLA-ZEPEDA ◽  
Deni TORRES-ROMAN ◽  
Andrés MENDEZ-VAZQUEZ

Given the improvement of Remote Sensing (RS) sensors, it has been possible to increase the spatial and spectral resolution of many of them. Nevertheless, the amount of data to represent and post-process has become prohibitively large. There is therefore a need to process such huge data sets, and one possible way to deal with this problem is to use compression methods; however, data loss occurs whenever the data size must be reduced. RS spectral imagery contains a high quantity of redundant information along the spectral domain, which makes it possible to apply compression methods effectively, for example, tensor decomposition algorithms. In Tucker decomposition (TKD), an interesting phenomenon occurs when the spatial domain is maintained and the spectral domain is reduced as a preprocessing step for a semantic segmentation task: under these conditions, it is possible to observe an improvement in the Pixel Accuracy (PA) metric compared with the same uncompressed spectral image. Therefore, this work presents a study of how noise affects Tucker decomposition compared with Principal Component Analysis (PCA), and of its impact on semantic segmentation.
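The configuration described above (spatial domain kept, spectral domain reduced) amounts to a Tucker-1 compression along the band axis. A minimal sketch follows, assuming an H x W x B hyperspectral cube and using a plain SVD of the band unfolding; it is illustrative only. Unlike PCA, no mean-centering is applied, which is one practical difference between the two methods compared in this study.

```python
import numpy as np

def spectral_tucker1(img, r):
    """Tucker-1 compression of an H x W x B cube: spatial modes kept intact,
    the spectral mode reduced from B to r components."""
    H, W, B = img.shape
    X = img.reshape(-1, B)                 # pixels x bands (spectral unfolding)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    Vr = Vt[:r].T                          # B x r spectral basis
    Z = (X @ Vr).reshape(H, W, r)          # reduced cube fed to segmentation
    return Z, Vr

def reconstruct(Z, Vr):
    """Map the reduced cube back to the original B spectral bands."""
    H, W, r = Z.shape
    return (Z.reshape(-1, r) @ Vr.T).reshape(H, W, Vr.shape[0])
```

A segmentation network would then consume the `H x W x r` cube `Z` in place of the original `H x W x B` image; when the cube has low spectral rank, the round trip is lossless.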


2020 ◽  
Vol 10 (3) ◽  
pp. 797 ◽  
Author(s):  
Rafał Zdunek ◽  
Tomasz Sadowski

The issue of image completion has developed considerably over the last two decades, and many computational strategies have been proposed to fill in missing regions of an incomplete image. When the incomplete image contains many small, irregular missing areas, a good alternative is matrix or tensor decomposition algorithms that yield low-rank approximations. However, this approach requires heuristic rank adaptation techniques, especially for images with many details. To tackle the obstacles of low-rank completion methods, we propose to model the incomplete image with overlapping blocks of Tucker decomposition representations, where the factor matrices are determined by a hybrid of Gaussian radial basis function and polynomial interpolation. The experiments, carried out on various image completion and resolution up-scaling problems, demonstrate that our approach considerably outperforms the baseline and state-of-the-art low-rank completion methods.
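As background for the low-rank completion baselines this work compares against, here is a minimal hard-impute-style matrix completion sketch, where a fixed rank must be chosen up front; the rank sensitivity of this scheme is the kind of obstacle the block-Tucker approach is designed to sidestep. This is illustrative only, not the authors' method.

```python
import numpy as np

def svd_impute(Y, mask, rank, n_iter=300):
    """Baseline low-rank completion: alternate between a truncated-SVD
    approximation and re-imposing the observed entries (hard impute)."""
    X = np.where(mask, Y, 0.0)             # start with missing entries at zero
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # best rank-r approximation
        X = np.where(mask, Y, L)           # keep observed pixels, fill the rest
    return X
```

The choice of `rank` is a tuning parameter: too small and detail is lost, too large and the missing regions overfit noise, which is precisely why heuristic rank adaptation becomes necessary for detailed images.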

