Convergence rates for the stochastic gradient descent method for non-convex objective functions

04/02/2019
by Benjamin Fehrman, et al.

We prove local convergence to minima and estimates on the rate of convergence for the stochastic gradient descent method in the case of objective functions that are not necessarily globally convex or contracting. In particular, the results apply to simple objective functions arising in machine learning.
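As a minimal illustration of the setting (a sketch, not the paper's own construction), the following runs stochastic gradient descent with decreasing Robbins–Monro step sizes on a simple non-convex objective f(x) = (x² − 1)², which has two global minima at x = ±1 and a local maximum at x = 0. The step-size schedule, noise model, and all parameter choices are illustrative assumptions.

```python
import random

def grad(x):
    # Gradient of the non-convex objective f(x) = (x^2 - 1)^2,
    # which has global minima at x = -1 and x = +1.
    return 4.0 * x * (x * x - 1.0)

def sgd(x0, steps=20000, seed=0):
    """SGD with noisy gradient evaluations and decreasing step
    sizes gamma_n = 1 / (n + 50) (illustrative choices)."""
    rng = random.Random(seed)
    x = x0
    for n in range(steps):
        noise = rng.gauss(0.0, 1.0)   # stochastic perturbation of the gradient
        gamma = 1.0 / (n + 50.0)      # square-summable but not summable steps
        x -= gamma * (grad(x) + noise)
    return x

x_final = sgd(x0=0.5)
```

Despite the noise and the lack of global convexity, the iterates settle near one of the two minima x = ±1, consistent with the local-convergence picture the abstract describes.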
