A note on L^1-Convergence of the Empirical Minimizer for Unbounded Functions with Fast Growth
For V : ℝ^d → ℝ coercive, we study the rate of convergence, in L^1-distance, of the empirical minimizer — the minimum of V computed from a finite number n of noisy samples — to the true minimum of V. We show that, in general, for unbounded functions with fast growth, the convergence rate is bounded above by a_n n^{-1/q}, where q is the dimension of the latent random variable and a_n = o(n^ε) for every ε > 0. We then present applications to optimization problems arising in Machine Learning and in Monte Carlo simulation.
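The rate described above can be illustrated numerically. The following sketch (my own toy example, not taken from the paper) uses q = 1, V(x) = |x|, and uniform noise on [-1, 1]: the gap between the empirical minimizer min_i V(X_i) and the true minimum V(0) = 0 should then shrink roughly like n^{-1/q} = n^{-1}.

```python
import random

def empirical_min_gap(n, trials=200, seed=0):
    """Average gap between the empirical minimizer min_i V(X_i)
    and the true minimum of V, over `trials` independent runs.

    Toy setup (an assumption for illustration): V(x) = |x|,
    samples X_i drawn uniformly from [-1, 1], so min V = V(0) = 0.
    """
    rng = random.Random(seed)
    V = abs
    total = 0.0
    for _ in range(trials):
        total += min(V(rng.uniform(-1.0, 1.0)) for _ in range(n))
    return total / trials

if __name__ == "__main__":
    # The average gap should drop by roughly a factor of 10
    # each time n grows by a factor of 10, matching n^{-1/q} with q = 1.
    for n in (10, 100, 1000):
        print(n, empirical_min_gap(n))
```

Here |X_i| is uniform on [0, 1], so the expected gap is exactly 1/(n+1), consistent with the n^{-1/q} upper bound for this one-dimensional example.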