Unifying tensor factorization and tensor nuclear norm approaches for low-rank tensor completion
Abstract: Low-rank tensor completion (LRTC) has gained significant attention due to its powerful capability of recovering missing entries. However, it must repeatedly compute the time-consuming singular value decomposition (SVD). To address this drawback, we propose, based on the tensor-tensor product (t-product), a new LRTC method, the unified tensor factorization (UTF), for 3-way tensor completion. We first integrate tensor factorization (TF) and tensor nuclear norm (TNN) regularization into a framework that inherits the benefits of both: the fast computation of TF and the convex optimization of TNN. The conditions under which TF and TNN are equivalent are analyzed. We then present UTF for tensor completion and solve the UTF optimization with an efficient iterative update algorithm based on the alternating direction method of multipliers (ADMM); the solution of the proposed alternating minimization algorithm is also proven to converge to a Karush–Kuhn–Tucker (KKT) point. Finally, numerical experiments on synthetic data completion and image/video inpainting tasks demonstrate the effectiveness of our method over other state-of-the-art tensor completion methods.
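The two building blocks the abstract refers to, the t-product and the tensor nuclear norm, can both be computed in the Fourier domain along the third mode. The sketch below is an illustrative NumPy implementation of these standard definitions from the t-SVD literature, not the authors' UTF algorithm; the function names and the normalization of the TNN by the third dimension (one common convention) are my own assumptions.

```python
import numpy as np

def t_product(A, B):
    """t-product of 3-way tensors A (n1 x r x n3) and B (r x n2 x n3).

    Computed as slice-wise matrix products in the Fourier domain
    along the third mode, then an inverse FFT back.
    """
    Af = np.fft.fft(A, axis=2)
    Bf = np.fft.fft(B, axis=2)
    # Multiply matching frontal slices: Cf[:, :, k] = Af[:, :, k] @ Bf[:, :, k]
    Cf = np.einsum('ijk,jlk->ilk', Af, Bf)
    return np.fft.ifft(Cf, axis=2).real

def tensor_nuclear_norm(X):
    """TNN of a 3-way tensor: mean of the nuclear norms of the
    frontal slices in the Fourier domain (one common normalization)."""
    Xf = np.fft.fft(X, axis=2)
    n3 = X.shape[2]
    return sum(np.linalg.norm(Xf[:, :, k], 'nuc') for k in range(n3)) / n3
```

For a tensor with a single frontal slice (n3 = 1), the TNN reduces to the ordinary matrix nuclear norm, and the t-product reduces to the matrix product, which is a quick sanity check on the definitions.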
Citation: Du, S.; Xiao, Q.; Shi, Y.; Cucchiara, R.; Ma, Y. "Unifying tensor factorization and tensor nuclear norm approaches for low-rank tensor completion," Neurocomputing, vol. 458, pp. 204–218, 2021. DOI: 10.1016/j.neucom.2021.06.020