On stochastic gradient Langevin dynamics with dependent data streams: the fully non-convex case
We consider the problem of sampling from a target distribution that is not necessarily log-concave. We establish non-asymptotic guarantees, in a suitable Wasserstein-type distance, for the Stochastic Gradient Langevin Dynamics (SGLD) algorithm when the gradient estimates are driven by possibly dependent data streams. Our bounds are sharper and uniform in the number of iterations, in contrast to those in previous studies.
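To fix ideas, the SGLD iteration referred to above can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact scheme: the function names, the i.i.d. mini-batch sampling (the paper allows dependent data streams), and the Gaussian demo target are all assumptions made for the example.

```python
import numpy as np

def sgld(grad_U, data, theta0, step, n_iter, batch=32, rng=None):
    """SGLD update: theta <- theta - step * g + sqrt(2 * step) * noise,
    where g is a mini-batch estimate of the gradient of the potential U
    (the negative log-density of the target)."""
    rng = np.random.default_rng(0) if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    samples = np.empty((n_iter, theta.size))
    n = len(data)
    for k in range(n_iter):
        idx = rng.integers(0, n, size=batch)   # i.i.d. mini-batch (illustrative only)
        g = grad_U(theta, data[idx])           # stochastic gradient estimate of U
        theta = theta - step * g + np.sqrt(2.0 * step) * rng.standard_normal(theta.size)
        samples[k] = theta
    return samples

# Hypothetical demo: the target is the posterior of the mean of i.i.d. N(theta, 1)
# observations under a flat prior, i.e. N(xbar, 1/n); the chain should settle near xbar.
rng = np.random.default_rng(1)
data = rng.standard_normal(100) + 1.0
grad_U = lambda th, xb: (len(data) / len(xb)) * np.sum(th - xb)  # rescaled mini-batch gradient
samples = sgld(grad_U, data, np.zeros(1), step=1e-3, n_iter=5000)
```

Discarding an initial burn-in portion of `samples` and averaging the rest gives an estimate close to the data mean, illustrating the sampling behavior the paper analyzes quantitatively.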