Singular Value Decomposition - Tensor SVD

Tensor SVD

Unfortunately, the problem of finding a best low-rank approximation to a tensor of order three or higher is ill-posed: a best solution need not exist. Instead, there may be a sequence of better and better approximations whose individual rank-1 components grow without bound, even as the approximations themselves improve. In spite of this, there are several ways to attempt a decomposition, and two types of tensor decomposition generalise the SVD to multi-way arrays.

The first decomposes a tensor into a sum of rank-1 tensors; see the CANDECOMP/PARAFAC (CP) algorithm. CP should not be confused with a best rank-R decomposition: for a given R, it seeks a sum of R rank-1 tensors that fits the original tensor as well as possible, and that optimum need not exist. The second type computes the orthonormal subspaces associated with the different axes, or modes, of a tensor (the orthonormal row space, column space, fiber space, and so on). This decomposition is referred to in the literature as the Tucker3/TuckerM decomposition, the M-mode SVD, the multilinear SVD, and sometimes the higher-order SVD (HOSVD). In addition, multilinear principal component analysis in multilinear subspace learning involves the same mathematical operations as the Tucker decomposition, applied in the different context of dimensionality reduction.
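
To make the first approach concrete, here is a minimal sketch of a CP approximation computed by alternating least squares (ALS) in Python with NumPy. The helper `unfold`, the function `cp_als`, and the fixed iteration count are illustrative assumptions rather than a prescribed algorithm; a production implementation (e.g., in a library such as TensorLy) would add column normalization and convergence checks.

```python
import numpy as np

def unfold(tensor, mode):
    # Matricize: rows index the chosen mode, columns the remaining modes
    # (C-order reshape, so later modes vary fastest along the columns).
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def cp_als(tensor, rank, n_iter=200, seed=0):
    # Alternating least squares: fix all factor matrices but one,
    # then solve a linear least-squares problem for that one.
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((dim, rank)) for dim in tensor.shape]
    for _ in range(n_iter):
        for mode in range(tensor.ndim):
            # Khatri-Rao (column-wise Kronecker) product of the other factors.
            others = [f for m, f in enumerate(factors) if m != mode]
            kr = others[0]
            for f in others[1:]:
                kr = (kr[:, None, :] * f[None, :, :]).reshape(-1, rank)
            # Least-squares update for the current mode's factor matrix.
            factors[mode] = np.linalg.lstsq(kr, unfold(tensor, mode).T,
                                            rcond=None)[0].T
    return factors
```

For the second approach, the sketch below computes an HOSVD: the left singular vectors of each mode's unfolding give an orthonormal basis for that mode, and contracting the tensor with those bases yields the core tensor. The function name and conventions are again assumptions for illustration; truncating each factor matrix to its leading columns gives the truncated HOSVD commonly used for low-multilinear-rank approximation.

```python
def hosvd(tensor):
    # One orthonormal factor matrix per mode, from the SVD of each unfolding.
    factors = []
    for mode in range(tensor.ndim):
        U, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(U)
    # Core tensor: contract each mode with the transpose of its basis.
    core = tensor
    for mode, U in enumerate(factors):
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

# Example: a random 4 x 5 x 6 tensor (hypothetical test data).
X = np.random.default_rng(1).standard_normal((4, 5, 6))
core, Us = hosvd(X)
# With untruncated factors, the reconstruction is exact up to rounding.
recon = core
for mode, U in enumerate(Us):
    recon = np.moveaxis(
        np.tensordot(U, np.moveaxis(recon, mode, 0), axes=1), 0, mode)
assert np.allclose(recon, X)
```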