An Inexact Projected Gradient Method with Rounding and Lifting by Nonlinear Programming for Solving Rank-One Semidefinite Relaxation of Polynomial Optimization

Ling Liang, National University of Singapore
Webinar

We consider solving high-order semidefinite programming (SDP) relaxations of nonconvex polynomial optimization problems (POPs) that often admit degenerate rank-one solutions. Instead of solving the SDP alone, we propose a new framework that blends local search on the nonconvex POP with global descent on the convex SDP. In particular, we first design a globally convergent inexact projected gradient method (iPGM) for solving the SDP, which serves as the backbone of the framework. We then accelerate the iPGM by taking long, but safeguarded, rank-one steps generated by fast nonlinear programming algorithms, and we establish convergence properties of the resulting framework. Numerically, we conduct experiments on large-scale second-order SDP relaxations arising from a diverse set of POPs, and our framework demonstrates promising practical performance.
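
To give a concrete picture of the safeguarded rank-one acceleration, the following is a minimal Python sketch of the idea, not the authors' iPGM: it assumes a plain linear SDP objective <C, X>, projects exactly onto the PSD cone only (affine constraints, inexactness control, and step-size rules are omitted), and uses a hypothetical `pop_local_solve` callback standing in for the fast nonlinear programming local search.

```python
import numpy as np

def proj_psd(X):
    """Project a symmetric matrix onto the PSD cone by clipping negative eigenvalues."""
    w, V = np.linalg.eigh((X + X.T) / 2)
    return (V * np.clip(w, 0.0, None)) @ V.T

def pgm_with_rank_one_safeguard(C, X0, pop_local_solve, step=1e-2, iters=100):
    """Toy projected gradient loop for min <C, X> over the PSD cone, with a
    safeguarded rank-one acceleration step (illustrative sketch only)."""
    X = X0
    for _ in range(iters):
        # gradient step on the linear objective, then projection onto the cone
        X = proj_psd(X - step * C)

        # "long" rank-one step: run a local NLP solver on the POP (hypothetical
        # callback) and lift its solution v to the rank-one matrix v v^T
        v = pop_local_solve(X)
        X_r1 = np.outer(v, v)

        # safeguard: keep the lifted candidate only if it does not worsen
        # the SDP objective <C, X>
        if np.sum(C * X_r1) <= np.sum(C * X):
            X = X_r1
    return X
```

The safeguard is what makes the long rank-one steps safe: the lifted candidate is accepted only when it does not increase the convex SDP objective, so the local search can accelerate progress without breaking the global descent of the projected gradient iterations.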