Symmetric Gauss-Seidel decomposition theorem
15 Oct 2015. NUS mathematicians have discovered a symmetric Gauss-Seidel (sGS) decomposition theorem for solving multi-block convex composite quadratic programming problems whose objective also contains a nonsmooth term in the first block.
The sGS decomposition theorem can be incorporated into various optimization algorithms to efficiently solve large scale linearly constrained multi-block convex composite quadratic conic programming problems and beyond. In particular, this theorem allowed them to resolve two non-trivial issues in optimization: (a) designing a highly efficient semi-proximal alternating direction method of multipliers (ADMM), with convergence guarantees, for solving a large class of multi-block convex programming problems, and (b) designing an inexact accelerated block coordinate descent (ABCD) method. Using these algorithms, a team led by Profs SUN Defeng and TOH Kim Chuan from the Department of Mathematics at NUS is able to solve various classes of large scale semidefinite programming problems whose sizes were unthinkable only a couple of years ago.
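The classical symmetric Gauss-Seidel idea underlying the theorem, a forward sweep through the variables followed by a backward sweep, can be illustrated on a plain symmetric linear system. This is a minimal scalar-variable sketch of the textbook sGS iteration, not the team's block decomposition or solver code; the function name `sgs_sweep` is ours:

```python
import numpy as np

def sgs_sweep(A, b, x):
    """One symmetric Gauss-Seidel sweep on A x = b: forward then backward pass."""
    n = len(b)
    x = x.copy()
    # Forward pass: update x[0], ..., x[n-1] in order, each component
    # using the most recently updated values of the others.
    for i in range(n):
        x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
    # Backward pass: the same updates in reverse order, which makes the
    # combined iteration symmetric with respect to A.
    for i in reversed(range(n)):
        x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
    return x

# Demo on a small symmetric positive definite system, where the
# iteration is guaranteed to converge.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x = np.zeros(3)
for _ in range(50):
    x = sgs_sweep(A, b, x)
```

In the team's setting the scalar updates above become block updates (solving a subproblem per block of variables), and the forward-backward structure is what yields the decomposition used inside ADMM and ABCD.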
In many high-dimensional data analysis and statistical estimation problems, one often needs to add regularization terms, such as those promoting sparsity or low-rank structure, to the loss function to tackle the fundamental issue of insufficient observations from a big system. This inevitably generates difficult composite optimization problems, in particular composite quadratic programming problems whose objectives contain both smooth and nonsmooth component functions, subject to a huge number of linear equality and inequality constraints. The team has adapted their algorithmic advances to design highly efficient algorithms for some of these problems.
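As a concrete example of such a nonsmooth regularizer, the l1-norm sparsity penalty admits a closed-form proximal step, the well-known soft-thresholding operator. This is a generic illustration of how the nonsmooth term is handled in composite methods, not code from the team's solvers:

```python
import numpy as np

def prox_l1(v, t):
    """Proximal map of t*||x||_1: soft-thresholding.

    Solves argmin_x 0.5*||x - v||^2 + t*||x||_1, componentwise.
    """
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Components smaller in magnitude than t are zeroed out, which is
# exactly how the l1 term promotes sparsity in the solution.
v = np.array([3.0, -0.2, 0.5, -1.5])
shrunk = prox_l1(v, 1.0)
```

Here the entries -0.2 and 0.5 are set to zero while the large entries are merely shrunk toward zero, leaving a sparse vector.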
The team’s research currently focuses on three main areas: (1) developing robust and efficient software packages for solving large scale semidefinite programming problems with bound constraints and other classes of conic programming problems; (2) adapting the algorithmic advances they have achieved to design efficient specialized algorithms for various classes of convex composite conic programming problems arising in application areas such as machine learning and statistical estimation; (3) further deriving the theoretical convergence properties of the algorithms and their extensions, and designing even more powerful algorithms for even larger classes of conic programming problems.
A figure shows the performance profiles of the algorithms SDPNAL+ (blue solid line) and ADMM+ (red dashed line) versus state-of-the-art solvers (black and magenta curves) [Image credit: Toh Kim Chuan].
1. Li XD, Sun DF, Toh KC. “A Schur complement based semi-proximal ADMM for convex quadratic conic programming and extensions”, Mathematical Programming, in print.
2. Yang LQ, Sun DF, Toh KC. “SDPNAL+: a majorized semismooth Newton-CG augmented Lagrangian method for semidefinite programming with nonnegative constraints”, Mathematical Programming Computation 7 (2015) 331.
3. Sun DF, Toh KC, Yang LQ. “A convergent 3-block semi-proximal alternating direction method of multipliers for conic programming with 4-type constraints”, SIAM Journal on Optimization 25 (2015) 882.