For large optimization problems, limited-memory compact quasi-Newton methods use low-rank updates to efficiently estimate the Hessian matrix of second derivatives. However, when additional second-derivative information is available, it is desirable to exploit it. This presentation describes the compact representation of two "structured" BFGS quasi-Newton update formulas, which combine available Hessian information with quasi-Newton updates. The compact representations enable effective structured limited-memory techniques and the computation of search directions via the Sherman-Morrison-Woodbury inverse. Implementations of two limited-memory structured BFGS algorithms are compared on a set of benchmark (CUTEst) problems and show promising improvements.
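To illustrate the kind of computation the abstract refers to, the following is a minimal NumPy sketch (not the speakers' structured formulas) of how a compact low-rank representation B = B0 + U M Uᵀ allows a search direction to be computed with the Sherman-Morrison-Woodbury identity: only a small m×m system is solved instead of a dense n×n one. The shapes and the diagonal B0 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 5  # problem dimension, memory size (number of stored update pairs)

# Cheap-to-invert initial Hessian approximation B0 (diagonal here).
d = rng.uniform(1.0, 2.0, n)
B0_inv = np.diag(1.0 / d)

# Hypothetical compact factors: B = B0 + U M U^T, with M symmetric invertible.
U = rng.standard_normal((n, m))
M = np.diag(rng.uniform(0.5, 1.5, m))

g = rng.standard_normal(n)  # gradient at the current iterate

# Sherman-Morrison-Woodbury:
#   (B0 + U M U^T)^{-1} = B0^{-1} - B0^{-1} U (M^{-1} + U^T B0^{-1} U)^{-1} U^T B0^{-1}
# so solving B p = -g needs only an m x m dense solve.
B0_inv_U = B0_inv @ U
capacitance = np.linalg.inv(M) + U.T @ B0_inv_U  # m x m
p = -(B0_inv @ g - B0_inv_U @ np.linalg.solve(capacitance, B0_inv_U.T @ g))

# Sanity check against a direct dense solve of the full n x n system.
B = np.diag(d) + U @ M @ U.T
p_direct = np.linalg.solve(B, -g)
assert np.allclose(p, p_direct)
```

For large n this is the payoff of a compact representation: the dominant costs are matrix-vector products with the n×m factor U, rather than factoring an n×n matrix.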
Please use this link to attend the virtual seminar: