A New Lower Bound for Kullback-Leibler Divergence Based on Hammersley-Chapman-Robbins Bound

06/29/2019
by   Tomohiro Nishiyama, et al.

In this paper, we derive a useful lower bound for the Kullback-Leibler divergence (KL-divergence) based on the Hammersley-Chapman-Robbins bound (HCRB). The HCRB states that the variance of an estimator is bounded from below in terms of the chi-square divergence and the difference between the expectation values of the estimator under the two distributions. By using the relation between the KL-divergence and the chi-square divergence, we obtain a lower bound for the KL-divergence that depends only on the expectation value and the variance of a function we choose. We show that equality holds for Bernoulli distributions and that the inequality reduces to the Cramér-Rao bound when the two distributions are very close. Furthermore, we describe examples of applications and numerical calculations.
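For reference, the HCRB invoked in the abstract can be written in the following divergence form. This is only a sketch of the standard statement in our own notation (P and Q denote the two distributions, f an arbitrary function with finite variance under Q, and T an estimator); it is not the paper's new bound, which is given in the full text.

% Chi-square-divergence form of the Hammersley-Chapman-Robbins bound
% (standard statement, notation ours, not the paper's result).
\[
  \chi^2(P \,\|\, Q) \;\ge\; \frac{\bigl(\mathbb{E}_P[f] - \mathbb{E}_Q[f]\bigr)^2}{\operatorname{Var}_Q[f]},
\]
% equivalently, in estimation form for a parametric family \{P_\theta\},
\[
  \operatorname{Var}_\theta(T) \;\ge\; \sup_{\theta' \neq \theta} \frac{\bigl(\mathbb{E}_{\theta'}[T] - \mathbb{E}_\theta[T]\bigr)^2}{\chi^2(P_{\theta'} \,\|\, P_\theta)}.
\]

Letting \theta' tend to \theta in the second form recovers the Cramér-Rao bound, which matches the limiting behaviour described in the abstract.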
