Applied Mathematics Seminar - Langevin Algorithms and Wasserstein Convergence of Score-based Generative Models
Langevin-based algorithms, derived from a certain class of stochastic differential equations, have emerged as powerful tools both for sampling in high-dimensional spaces and for optimization in non-convex landscapes. Over the years, these methods have shown remarkable success in addressing the challenges posed by complex, high-dimensional distributions and the inherent difficulties of non-convex optimization problems. Their ability to explore multi-modal distributions efficiently and to escape local minima makes them well suited to a wide range of applications, from machine learning to physical simulations; a canonical instance is sketched below.

Score-based Generative Models (SGMs) approximate a data distribution by perturbing it with Gaussian noise and then denoising it via a learned reverse diffusion process (also sketched below). These models excel at capturing complex data distributions and generating diverse samples, achieving state-of-the-art performance across domains such as computer vision, audio generation, and computational biology. In this talk, we highlight some recent progress on non-asymptotic Wasserstein-2 convergence guarantees for SGMs and for a certain class of Langevin samplers for non-convex potentials.
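For concreteness, one canonical member of this class (a minimal sketch in our own notation; the talk may treat a more general family) is the overdamped Langevin diffusion targeting $\pi(x) \propto e^{-U(x)}$ for a potential $U$, together with its Euler–Maruyama discretization, the unadjusted Langevin algorithm with step size $\eta > 0$:
\[
\mathrm{d}X_t = -\nabla U(X_t)\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t,
\qquad
x_{k+1} = x_k - \eta\,\nabla U(x_k) + \sqrt{2\eta}\,\xi_k,
\quad \xi_k \sim \mathcal{N}(0, I_d).
\]
The diffusion admits $\pi$ as its invariant measure, and the Gaussian increments $\xi_k$ are what allow the iterates to escape local minima of $U$; guarantees of the kind discussed in the talk typically bound the Wasserstein-2 distance between the law of $x_k$ and $\pi$ when $U$ is non-convex.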
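Similarly, a standard formulation of the SGM pipeline (again a sketch under common assumptions, here with an Ornstein–Uhlenbeck forward process; the precise setting of the talk may differ) perturbs the data by running a forward noising SDE and generates samples by simulating its time reversal, with the unknown score $\nabla \log p_t$ replaced by a learned approximation $s_\theta$:
\[
\mathrm{d}X_t = -X_t\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t, \qquad X_0 \sim p_{\mathrm{data}},
\]
\[
\mathrm{d}Y_t = \bigl(Y_t + 2\,s_\theta(T-t, Y_t)\bigr)\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}\bar{B}_t, \qquad Y_0 \sim \mathcal{N}(0, I_d),
\]
where $p_t$ denotes the law of $X_t$. With the exact score $s_\theta = \nabla \log p_t$ and initialization $Y_0 \sim p_T$, one has $Y_T \sim p_{\mathrm{data}}$; Wasserstein-2 guarantees of the kind highlighted in the talk quantify how the score approximation error, the time discretization, and the Gaussian initialization propagate to the distance between the law of the generated samples and $p_{\mathrm{data}}$.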