Improved Analysis of Score-based Generative Modeling: User-Friendly Bounds under Minimal Smoothness Assumptions

11/03/2022
by Hongrui Chen, et al.

In this paper, we focus on the theoretical analysis of diffusion-based generative modeling. Given an L^2-accurate score estimator, we provide convergence guarantees with polynomial complexity for any data distribution with a finite second moment, by either employing an early stopping technique or assuming a smoothness condition on the score function of the data distribution. Our result does not rely on any log-concavity or functional inequality assumption and has only a logarithmic dependence on the smoothness. In particular, we show that under a finite second moment condition alone, approximating the following to ϵ-accuracy in KL divergence can be done in Õ(d^2 log^2 (1/δ)/ϵ^2) steps: 1) the variance-δ Gaussian perturbation of any data distribution; 2) data distributions with 1/δ-smooth score functions. Our theoretical analysis also provides a quantitative comparison between different discrete approximations and may guide the choice of discretization points in practice.
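To make the setting concrete, the sketch below shows one common way such a discretized reverse-time sampler with early stopping can be set up: an Ornstein-Uhlenbeck forward process, a generic score estimator `score_fn`, and geometrically spaced discretization points that stop at time δ. All names, parameters, and the choice of discretization schedule here are illustrative assumptions for exposition, not the paper's specific algorithm or constants.

```python
import numpy as np

def sample_reverse_ou(score_fn, d, T=10.0, delta=1e-3, n_steps=1000, rng=None):
    """Sketch: Euler-Maruyama discretization of the reverse-time
    Ornstein-Uhlenbeck process, stopped early at time delta.

    score_fn(x, t): estimate of grad log p_t(x) for the forward OU process
    d: data dimension
    T: forward diffusion horizon
    delta: early-stopping time; the output approximates the variance-delta
           Gaussian perturbation of the data distribution
    """
    rng = np.random.default_rng() if rng is None else rng
    # Initialize the reverse process from the stationary N(0, I) distribution.
    x = rng.standard_normal(d)
    # Geometrically spaced times concentrate discretization points near
    # t = delta, where the score may blow up (illustrative choice).
    times = np.geomspace(T, delta, n_steps + 1)
    for t_cur, t_next in zip(times[:-1], times[1:]):
        h = t_cur - t_next  # positive step size; we integrate backwards in t
        score = score_fn(x, t_cur)
        # Reverse SDE for the OU forward process dx = -x dt + sqrt(2) dW:
        # run backwards it reads dx = (x + 2 * score) dt + sqrt(2) dW.
        x = x + h * (x + 2.0 * score) + np.sqrt(2.0 * h) * rng.standard_normal(d)
    return x  # approximate sample from the variance-delta perturbed data law
```

Under the assumptions above, early stopping at time δ corresponds to targeting the variance-δ Gaussian perturbation of the data distribution rather than the data distribution itself, which is one of the two settings the paper's Õ(d^2 log^2 (1/δ)/ϵ^2) guarantee covers.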

