Gradient Descent Converges to Minimizers
We show that, with random initialization, gradient descent converges to a local minimizer almost surely. The proof applies the Stable Manifold Theorem from dynamical systems theory.
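As an illustrative sketch (not from the paper), the phenomenon can be seen numerically on a hypothetical objective with a strict saddle: f(x, y) = (x² − 1)² + y² has minimizers at (±1, 0) and a saddle at the origin, whose stable manifold has measure zero, so a random start almost surely escapes it.

```python
import numpy as np

# Illustrative sketch (assumed example, not from the paper):
# f(x, y) = (x^2 - 1)^2 + y^2 has local minimizers at (+-1, 0)
# and a strict saddle at (0, 0). With random initialization,
# gradient descent avoids the saddle's measure-zero stable
# manifold and converges to a minimizer.

def grad(p):
    x, y = p
    return np.array([4.0 * x * (x**2 - 1.0), 2.0 * y])

rng = np.random.default_rng(0)
p = rng.uniform(-1.5, 1.5, size=2)  # random initialization
eta = 0.05                          # step size

for _ in range(2000):
    p = p - eta * grad(p)           # gradient descent update

x, y = p
print(x, y)  # |x| close to 1 and y close to 0: a minimizer, not the saddle
```

Running this from almost any starting point lands at (1, 0) or (−1, 0); only initializations exactly on the y-axis (a measure-zero set) would flow to the saddle.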