On the Mixing Times of Langevin Algorithms for Sampling under Isoperimetry

Andre Wibisono
Yale

Sampling is a fundamental algorithmic task with many applications and deep connections to optimization. In continuous time, the Langevin dynamics is a natural dynamics for sampling, and it corresponds to running gradient flow for minimizing relative entropy (KL divergence) in the space of probability distributions. Furthermore, the Langevin dynamics has fast convergence guarantees under natural isoperimetric assumptions, such as the log-Sobolev inequality (LSI) or the Poincaré inequality. In discrete time, deriving the iteration complexity of sampling algorithms is more challenging. In this talk, we will discuss a simple discretization, the Unadjusted Langevin Algorithm (ULA), and review its biased convergence guarantees under isoperimetry and smoothness. We will then discuss the Proximal Sampling Algorithm, which has unbiased convergence guarantees under isoperimetry, and show how to implement it via rejection sampling under smoothness, which leads to the current best iteration complexity for sampling in discrete time under isoperimetry. This talk is based on joint work with Santosh Vempala, Sinho Chewi, Yongxin Chen, and Adil Salim.
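
For readers unfamiliar with ULA, the following is a minimal illustrative sketch, not the speaker's implementation: it discretizes the Langevin dynamics for a target density proportional to exp(-f) by taking a gradient step on f and adding Gaussian noise at each iteration. The function names, step size, and the Gaussian example target are assumptions made for illustration.

```python
import numpy as np

def ula_sample(grad_f, x0, step_size, n_iters, rng=None):
    """Unadjusted Langevin Algorithm (ULA), illustrative sketch.

    Iterates x_{k+1} = x_k - step_size * grad_f(x_k) + sqrt(2 * step_size) * N(0, I),
    approximately targeting the density proportional to exp(-f(x)).
    With a fixed step size, the stationary distribution is biased away from the target.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        noise = rng.standard_normal(x.shape)
        x = x - step_size * grad_f(x) + np.sqrt(2.0 * step_size) * noise
    return x

# Hypothetical example: approximately sample from a standard Gaussian,
# where f(x) = ||x||^2 / 2 and grad_f(x) = x.
if __name__ == "__main__":
    grad_f = lambda x: x
    x = ula_sample(grad_f, x0=np.zeros(2), step_size=0.01, n_iters=5000)
    print(x)
```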