Semi-orthogonal multilinear PCA with relaxed start

Abstract

Principal component analysis (PCA) is an unsupervised method for learning low-dimensional features with orthogonal projections. Multilinear PCA methods extend PCA to deal with multidimensional data (tensors) directly via tensor-to-tensor projection or tensor-to-vector projection (TVP). However, under the TVP setting, it is difficult to develop an effective multilinear PCA method with the orthogonality constraint. This paper tackles this problem by proposing a novel Semi-Orthogonal Multilinear PCA (SO-MPCA) approach. SO-MPCA learns low-dimensional features directly from tensors via TVP by imposing the orthogonality constraint in only one mode. This formulation captures more variance and learns more features than full orthogonality. For better generalization, we further introduce a relaxed start (RS) strategy, yielding SO-MPCA-RS, which fixes the starting projection vectors to increase the bias and reduce the variance of the learning model. Experiments on both face (2D) and gait (3D) data demonstrate that SO-MPCA-RS outperforms competing algorithms overall, and that the relaxed start strategy is also effective for other TVP-based PCA methods.
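To make the semi-orthogonal TVP idea concrete, here is a minimal NumPy sketch (not the paper's optimization algorithm) of projecting a 2D tensor into P scalar features with randomly chosen projection vectors, where the mode-1 vectors are constrained to be orthonormal and the mode-2 vectors are unconstrained apart from unit norm. All sizes and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D "tensor" (e.g., a face image) of size I1 x I2;
# P is the number of learned scalar features.
I1, I2, P = 8, 6, 5
X = rng.standard_normal((I1, I2))

# Mode-1 projection vectors u_p: the semi-orthogonal mode.
# QR factorization yields an orthonormal set of columns.
U, _ = np.linalg.qr(rng.standard_normal((I1, P)))

# Mode-2 projection vectors v_p: only normalized to unit length,
# with no orthogonality constraint across p.
V = rng.standard_normal((I2, P))
V /= np.linalg.norm(V, axis=0)

# Tensor-to-vector projection: each feature is a scalar
# y_p = u_p^T X v_p, giving a length-P feature vector per sample.
y = np.array([U[:, p] @ X @ V[:, p] for p in range(P)])

print(y.shape)                               # (P,) = (5,)
print(np.allclose(U.T @ U, np.eye(P)))       # orthogonality in mode 1 only
```

In the actual method, U and V are learned by maximizing captured variance over a dataset rather than drawn at random; the sketch only shows where the one-mode orthogonality constraint sits in the projection.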

Publication
International Joint Conference on Artificial Intelligence (IJCAI)
Qiquan Shi
PhD Student (now an AI Researcher at Huawei)
Haiping Lu
Director of the UK Open Multimodal AI Network, Professor of Machine Learning, and Head of AI Research Engineering