Tata Institute of Fundamental Research

Convex Gaussian Min-Max Theorem for Precise Error Analysis of Regularized Regression Problems in High Dimensions

STCS Student Seminar
Speaker: Tirthankar Adhikari (TIFR)
Organiser: Soumyadeep Paul
Date: Friday, 10 Oct 2025, 16:00 to 17:00
Venue: A-201 (STCS Seminar Room)

Abstract: 

Convex regularized optimization methods such as the LASSO and Group-LASSO are central to modern signal processing and high-dimensional regression. In their influential work, Thrampoulidis, Oymak, and Hassibi introduced the Convex Gaussian Min-Max Theorem (CGMT), an elegant framework for precise asymptotic error analysis of a wide class of convex regression problems under Gaussian measurement models. Building on Gordon's Gaussian min-max theorem and ideas of Stojnic, they showed that a complicated Primary Optimization (PO) problem can be replaced by a simpler, analytically tractable Auxiliary Optimization (AO) problem whose solution precisely predicts key quantities such as the normalized squared error and the optimal regularization parameters. Beyond its specific applications to linear and structured regression, the CGMT provides a powerful conceptual bridge between high-dimensional probability and convex optimization, enabling sharp asymptotic performance characterizations through the remarkably elegant PO-to-AO reduction.
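As a rough sketch of the reduction mentioned above (notation follows the standard CGMT setup; the constraint sets and the coupling function below are generic placeholders, not the specific problems treated in the talk), the PO and AO pair can be written as:

```latex
% Primary Optimization (PO): G is an m x n matrix with i.i.d. standard
% Gaussian entries; S_w, S_u are compact sets; psi is continuous.
\Phi(G) \;=\; \min_{w \in S_w} \; \max_{u \in S_u} \; u^{\top} G w \;+\; \psi(w, u)

% Auxiliary Optimization (AO): g ~ N(0, I_m) and h ~ N(0, I_n) are
% independent Gaussian vectors, replacing the Gaussian matrix G.
\phi(g, h) \;=\; \min_{w \in S_w} \; \max_{u \in S_u} \;
    \|w\|_2 \, g^{\top} u \;+\; \|u\|_2 \, h^{\top} w \;+\; \psi(w, u)
```

Gordon's theorem relates the distributions of \(\Phi\) and \(\phi\) in one direction; the convexity assumptions of the CGMT (convex-concave \(\psi\), convex compact sets) upgrade this to a two-sided comparison, so concentration of the scalarized AO value transfers to the original PO.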