Structure Exploitation in Matrix Optimization Problems via Proximal Splitting Methods with Bregman Divergences

Xin Jiang, Lehigh University
MCS Seminar

We present first-order optimization algorithms that incorporate Bregman distances in their proximal mappings, and we show that the use of Bregman distances can exploit special structure in certain matrix optimization problems. The presented algorithms include several important methods as special cases when standard (Euclidean) proximal mappings are used. Extensions to generalized Bregman distances are attractive when the per-iteration complexity can be reduced by matching the Bregman distance to the problem structure. As an example, we apply the proposed method to sparse semidefinite programs. With a suitable choice of Bregman distance, the complexity of evaluating the Bregman proximal operator is shown to be roughly proportional to the cost of a sparse Cholesky factorization. This is much cheaper than the standard proximal operator with Euclidean distances, which requires an eigenvalue decomposition. The proposed methods are also applied to a class of log-determinant optimization problems that exhibit a difference-of-convex structure. For these problems, the proposed methods serve as the subproblem solver in the classic, conceptual difference-of-convex algorithm (DCA), and with a suitable choice of Bregman distance, each iteration of the proposed method has a closed-form expression. This yields the first efficient and scalable solver that is guaranteed to find the global optimum of this type of nonconvex problem.
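To illustrate the cost contrast the abstract describes, here is a minimal dense sketch (not the paper's sparse implementation, and the function names are hypothetical): the Euclidean proximal operator of the PSD-cone indicator requires a full eigenvalue decomposition, while a Bregman proximal step with the log-determinant kernel h(X) = -log det X applied to a linear objective <C, X> has the closed form (C + Y^{-1})^{-1}, computable with Cholesky factorizations alone.

```python
import numpy as np

def euclidean_prox_psd(M):
    """Euclidean projection onto the PSD cone.

    Requires a full eigendecomposition (O(n^3) with a large constant),
    which is the bottleneck the Bregman approach avoids.
    """
    w, V = np.linalg.eigh((M + M.T) / 2)          # symmetrize, then decompose
    return (V * np.maximum(w, 0)) @ V.T           # clip negative eigenvalues

def bregman_prox_logdet(C, Y):
    """Bregman proximal step for X -> <C, X> with kernel h(X) = -log det X.

    Setting the gradient of <C, X> + D_h(X, Y) to zero gives
        C - X^{-1} + Y^{-1} = 0   =>   X = (C + Y^{-1})^{-1},
    valid when C + Y^{-1} is positive definite. Only Cholesky
    factorizations and triangular solves are needed.
    """
    n = Y.shape[0]
    I = np.eye(n)
    L = np.linalg.cholesky(Y)                     # Y = L L^T (Y must be PD)
    Yinv = np.linalg.solve(L.T, np.linalg.solve(L, I))
    R = np.linalg.cholesky(C + Yinv)              # C + Y^{-1} = R R^T
    return np.linalg.solve(R.T, np.linalg.solve(R, I))
```

In the sparse semidefinite setting of the talk, the matrices inherit a chordal sparsity pattern and the Cholesky factors stay sparse, which is what makes the Bregman step roughly as cheap as one sparse factorization; this dense version only shows the algebra.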
