Alexander and Hermine Avanessians Professor
of Industrial Engineering and Operations Research
Department of Industrial Engineering and Operations Research
S.W. Mudd Building
500 West 120th Street
New York, NY 10027-6699
United States of America
Abstract: Recovering a low-rank matrix or tensor from incomplete or corrupted observations is a recurring problem in signal processing and machine learning. To exploit the structure of data that is intrinsically more than three-dimensional, convex models such as low-rank completion and robust principal component analysis (RPCA) for matrices have been extended to tensors. In this talk, we rigorously establish recovery guarantees for both tensor completion and tensor RPCA. We demonstrate that using the most popular convex relaxation for the tensor Tucker rank can be substantially suboptimal in terms of the number of observations needed for exact recovery. We introduce a very simple, new convex relaxation which is shown to be much better, both theoretically and empirically. Moreover, we propose algorithms to solve both low-rank matrix and tensor recovery models based on the Alternating Direction Augmented Lagrangian (ADAL), Frank-Wolfe, and proximal gradient methods. Finally, we empirically investigate the recoverability properties of these convex models, and the computational performance of our algorithms on both simulated and real data. (This is joint work with Cun Mu, Bo Huang and Tony Qin, IEOR PhD students at Columbia University, and John Wright, E.E. faculty member at Columbia University.)
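To make the flavor of these models concrete, the following is a minimal sketch (not the speakers' algorithm) of the matrix case: proximal gradient descent for nuclear-norm-regularized matrix completion, where the proximal operator of the nuclear norm is singular value thresholding. The parameters `tau`, `step`, and `iters` are illustrative choices, not values from the talk.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: the prox operator of tau * ||.||_* ."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def complete(M_obs, mask, tau=0.1, step=1.0, iters=500):
    """Proximal gradient for min_X 0.5*||P_Omega(X - M)||_F^2 + tau*||X||_*.

    M_obs : observed entries (zeros elsewhere); mask : boolean observation pattern.
    """
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        grad = mask * (X - M_obs)        # gradient of the smooth data-fit term
        X = svt(X - step * grad, step * tau)  # prox step on the nuclear norm
    return X

# Usage: recover a rank-2 matrix from ~60% of its entries.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random((30, 30)) < 0.6
X = complete(mask * A, mask)
err = np.linalg.norm(X - A) / np.linalg.norm(A)
```

The tensor models discussed in the talk replace the nuclear norm with a convex surrogate for Tucker rank applied to tensor unfoldings; the proximal machinery above carries over mode by mode.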