Understanding computational complexity is a long-standing pursuit in scientific computing. Two general questions arise: (1) how can we find an efficient algorithm? (2) is the algorithm we find the best one? In this talk, we discuss efforts to address both questions. First, we consider solving underdamped Langevin dynamics, widely used for sampling Gibbs distributions, and prove that a randomized midpoint algorithm (proposed by Shen and Lee [NeurIPS 2019]) is order-optimal in the sense of information complexity (which relates to the second question). Second, equilibrium sampling methods are known to suffer from metastability, and non-equilibrium sampling algorithms have been proposed to alleviate this challenge. We discuss a recent non-equilibrium importance sampling (NEIS) algorithm (proposed by Rotskoff and Vanden-Eijnden [PRL 2019]) that uses ODE flows to guide the base distribution toward the target distribution. Empirically, we use machine learning techniques to learn better ODE flows and show that NEIS can potentially reduce the query complexity compared with traditional Annealed Importance Sampling (which relates to the first question above).
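To illustrate the randomized-midpoint idea for underdamped Langevin dynamics, here is a minimal NumPy sketch. Caveats: this is a simplified variant using a crude Euler predictor to the random midpoint, not the exact Shen–Lee scheme (which integrates the linear part in closed form with correlated Gaussian increments), and the standard Gaussian target is an illustrative assumption.

```python
import numpy as np

def grad_f(x):
    """Gradient of f(x) = x^2 / 2, i.e. a standard Gaussian target (illustrative choice)."""
    return x

def randomized_midpoint_uld(x, v, h, gamma, n_steps, rng):
    """Simplified randomized-midpoint sketch for the underdamped Langevin dynamics
        dx = v dt,   dv = -(gamma * v + grad_f(x)) dt + sqrt(2 * gamma) dB.
    Each step draws a uniformly random evaluation point in [t, t + h]; the value
    there, scaled by h, gives an unbiased estimate of the drift's time integral."""
    for _ in range(n_steps):
        alpha = rng.uniform()  # random fraction of the step: midpoint at t + alpha * h
        # crude Euler predictor to the randomized midpoint (the exact scheme is sharper here)
        xm = x + alpha * h * v
        vm = (v - alpha * h * (gamma * v + grad_f(x))
              + np.sqrt(2 * gamma * alpha * h) * rng.standard_normal(x.shape))
        # full step using the randomized-midpoint values as integral estimates
        noise = rng.standard_normal(x.shape)
        x = x + h * vm
        v = v - h * (gamma * vm + grad_f(xm)) + np.sqrt(2 * gamma * h) * noise
    return x, v

# run many chains in parallel, initialized at (approximate) stationarity
rng = np.random.default_rng(0)
x0 = rng.standard_normal(400)
v0 = rng.standard_normal(400)
xs, vs = randomized_midpoint_uld(x0, v0, h=0.05, gamma=2.0, n_steps=500, rng=rng)
```

For the Gaussian target the stationary law of the positions is N(0, 1), so the empirical mean and variance of `xs` should stay close to 0 and 1 up to discretization and Monte Carlo error.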
- Yu Cao, Jianfeng Lu, and Lihan Wang. 2021. "Complexity of Randomized Algorithms for Underdamped Langevin Dynamics." Communications in Mathematical Sciences 19 (7): 1827–53.
- Yu Cao and Eric Vanden-Eijnden. 2022. "Learning Optimal Flows for Non-Equilibrium Importance Sampling." To appear in NeurIPS 2022. arXiv preprint: http://arxiv.org/abs/2206.09908.