Convergence rates for the stochastic gradient descent method for non-convex objective functions
We prove local convergence to minima, together with estimates on the rate of convergence, for the stochastic gradient descent method applied to objective functions that are not necessarily globally convex or contracting. In particular, the results apply to simple objective functions arising in machine learning.
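To illustrate the setting (not a construction from the paper), the following minimal Python sketch runs stochastic gradient descent on a simple non-convex objective, the double well f(θ) = (θ² − 1)², with an unbiased noisy gradient oracle. The objective, noise model, step-size schedule, and iteration count are all illustrative assumptions chosen to mimic the mini-batch noise of machine-learning objectives.

```python
import random


def grad(theta):
    """True gradient of the non-convex double well f(theta) = (theta^2 - 1)^2."""
    return 4.0 * theta * (theta * theta - 1.0)


def noisy_grad(theta, sigma=0.5):
    """Unbiased stochastic gradient: true gradient plus zero-mean Gaussian noise."""
    return grad(theta) + random.gauss(0.0, sigma)


def sgd(theta0, steps=10_000, sigma=0.5):
    """Run SGD with a decaying step size (illustrative Robbins-Monro schedule)."""
    theta = theta0
    for n in range(1, steps + 1):
        eta = 0.1 / n ** 0.75  # step sizes: sum diverges, sum of squares converges
        theta -= eta * noisy_grad(theta, sigma)
    return theta


if __name__ == "__main__":
    random.seed(0)
    # Started near the non-convex region, the iterates typically settle
    # near one of the two local minima at theta = +1 or theta = -1.
    print(sgd(theta0=0.3))
```

The objective has a local maximum at θ = 0 and two local minima at θ = ±1, so it is neither globally convex nor contracting; local convergence to one of the minima is the kind of behavior the paper's results quantify.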