Stochastic ADMM Frameworks for Resolving Structured Stochastic Convex Programs

Yue Xie
Seminar

We consider the program min { E[f(x, ξ)] + E[g(y, ξ)] : Ax + By = b }, which finds applications in regularized expected risk minimization and distributed regression. To exploit the problem structure and allow for distributed computation, we propose a stochastic inexact ADMM (SI-ADMM) in which the subproblems are solved inexactly via stochastic approximation. We establish almost-sure convergence of SI-ADMM as well as a geometric rate of convergence of its mean-squared error.
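To make the scheme concrete, the following is a minimal Python sketch of how one SI-ADMM-style iteration could look on a toy instance (quadratic f and g and the coupling constraint x - y = 0, i.e., A = I, B = -I, b = 0); the instance, step sizes, and inner-iteration counts are illustrative assumptions, not the setup analyzed in the talk.

    import numpy as np

    rng = np.random.default_rng(0)
    n, rho, outer, inner = 5, 1.0, 200, 20
    mu_a, mu_b = rng.normal(size=n), rng.normal(size=n)   # unknown means behind xi

    # Toy instance (assumed for illustration): f(x, xi) = 0.5*||x - xi_a||^2,
    # g(y, xi) = 0.5*||y - xi_b||^2, constraint x - y = 0.
    x, y, lam = np.zeros(n), np.zeros(n), np.zeros(n)

    for k in range(outer):
        # x-subproblem of the augmented Lagrangian, solved inexactly by
        # stochastic approximation (a few SGD steps with freshly sampled xi).
        for t in range(inner):
            xi_a = mu_a + 0.1 * rng.normal(size=n)
            grad = (x - xi_a) + lam + rho * (x - y)
            x = x - grad / ((1.0 + rho) * (t + 1))
        # y-subproblem, likewise solved inexactly by stochastic approximation.
        for t in range(inner):
            xi_b = mu_b + 0.1 * rng.normal(size=n)
            grad = (y - xi_b) - lam - rho * (x - y)
            y = y - grad / ((1.0 + rho) * (t + 1))
        # multiplier update for the constraint Ax + By = b.
        lam = lam + rho * (x - y)

    print("constraint residual ||x - y||:", np.linalg.norm(x - y))

The diminishing step sizes in the inner loops are one standard stochastic-approximation choice for strongly convex subproblems; other inexactness rules are possible.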

We also study the related program min { E[f(x, ξ)] + g(y) : Ax + By = b }, in which g(y) is deterministic but nonsmooth. For this class, we propose a variable sample-size stochastic ADMM (VSS-ADMM) and an accelerated variant (AVSS-ADMM). We show that these stochastic schemes close the gap in convergence rate with their deterministic counterparts; furthermore, VSS-ADMM may recover the canonical oracle complexity.
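For intuition, below is a minimal Python sketch of the variable sample-size idea on a toy l1-regularized instance; the growing batch schedule, the linearized x-update, and the problem data are illustrative assumptions rather than the exact updates analyzed in the talk.

    import numpy as np

    rng = np.random.default_rng(1)
    n, tau, rho, iters = 5, 0.1, 1.0, 100
    mu = rng.normal(size=n)                      # unknown mean behind xi

    def soft_threshold(v, t):
        # proximal operator of t*||.||_1
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    # Toy instance (assumed for illustration): f(x, xi) = 0.5*||x - xi||^2,
    # g(y) = tau*||y||_1 (deterministic, nonsmooth), constraint x - y = 0.
    x, y, lam = np.zeros(n), np.zeros(n), np.zeros(n)

    for k in range(iters):
        # Variable sample size: the mini-batch used to estimate the gradient of
        # E[f] grows with the iteration count, driving down estimator variance.
        batch = (k + 1) ** 2
        xi = mu + 0.1 * rng.normal(size=(batch, n))
        grad_est = np.mean(x - xi, axis=0)
        # x-update: minimize the linearized f plus the augmented-Lagrangian terms
        # grad_est^T x + lam^T (x - y) + (rho/2)*||x - y||^2, in closed form.
        x = y - (grad_est + lam) / rho
        # y-update: exact proximal step on the nonsmooth deterministic g.
        y = soft_threshold(x + lam / rho, tau / rho)
        # multiplier update.
        lam = lam + rho * (x - y)

    print("recovered y:", np.round(y, 3))

Because g is deterministic, its subproblem is handled exactly (here by soft-thresholding), while only the stochastic smooth term requires sampling.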

Preliminary numerical experiments demonstrate favorable properties of SI-ADMM, and AVSS-ADMM is observed to be comparable to state-of-the-art algorithms on graph-guided fused lasso problems.