Probabilistic rank-one tensor analysis with concurrent regularizations

Abstract

Subspace learning for tensors has attracted increasing interest in recent years, leading to multilinear extensions of principal component analysis (PCA) and probabilistic PCA (PPCA). Existing multilinear PPCAs are based on either the Tucker or the CANDECOMP/PARAFAC (CP) model. Although both kinds of multilinear PPCAs have proven effective in dealing with tensors, each has its own limitations: Tucker-based multilinear PPCAs have a restrictive subspace representation and suffer from rotational ambiguity, while CP-based ones are more prone to overfitting. To address these problems, we propose probabilistic rank-one tensor analysis (PROTA), a CP-based multilinear PPCA. PROTA has a more flexible subspace representation than Tucker-based PPCAs and avoids rotational ambiguity. To alleviate overfitting in CP-based PPCAs, we propose two simple and effective regularization strategies, named concurrent regularizations (CRs). By adjusting the noise variance or the moments of the latent features, these strategies concurrently and coherently penalize the entire subspace, relaxing unnecessary scale restrictions and providing more flexibility in regularizing CP-based PPCAs. To take full advantage of the probabilistic framework, we further propose a Bayesian treatment of PROTA, which achieves both automatic feature determination and robustness against overfitting. Experiments on synthetic and real-world datasets demonstrate the superiority of PROTA in subspace estimation and classification, as well as the effectiveness of CRs in alleviating overfitting.
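
As background, the CP (rank-one) model on which PROTA builds represents an order-$N$ data tensor as a sum of rank-one terms plus noise. A generic probabilistic formulation (the notation below is illustrative and not necessarily the paper's exact parameterization) is

$$
\mathcal{X} \;\approx\; \sum_{r=1}^{R} \mathbf{u}_r^{(1)} \circ \mathbf{u}_r^{(2)} \circ \cdots \circ \mathbf{u}_r^{(N)} + \mathcal{E}, \qquad \mathcal{E}_{i_1 \cdots i_N} \sim \mathcal{N}(0, \sigma^2),
$$

where $\circ$ denotes the vector outer product, $R$ is the number of rank-one components, and $\mathcal{E}$ is isotropic Gaussian noise whose variance $\sigma^2$ is one of the quantities the proposed concurrent regularizations adjust.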

Publication
IEEE Transactions on Cybernetics
Yang Zhou
PhD Student (now Postdoc at National University of Singapore)
Haiping Lu
Professor of Machine Learning, Head of AI Research Engineering, and Turing Academic Lead