Monotonically Decreasing Sequence of Divergences

10/01/2019
by Tomohiro Nishiyama, et al.

Divergences are quantities that measure the discrepancy between two probability distributions and play an important role in various fields such as statistics and machine learning. Divergences are non-negative and equal to 0 if and only if the two distributions are the same. In addition, some important divergences, such as the f-divergence, are convex; we call these "convex divergences". In this note, we focus on convex divergences and introduce integral and differential operators. For a convex divergence, the result of applying the integral or differential operator is also a divergence. In particular, the integral operator preserves convexity. Furthermore, the results of applying the integral operator multiple times constitute a monotonically decreasing sequence of convex divergences. From these properties, we derive new sequences of convex divergences that include the Kullback-Leibler divergence or the reverse Kullback-Leibler divergence.
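For context, a minimal LaTeX sketch of the standard f-divergence mentioned above, written in the usual notation (the symbols P, Q, p, q, and f are conventions of this illustration, not notation taken from the paper); the specific integral and differential operators introduced in the note are not reproduced here.

% f-divergence: f is convex with f(1) = 0, which makes D_f non-negative
% and zero iff P = Q (by Jensen's inequality).
\[
  D_f(P \,\|\, Q) \;=\; \sum_{x} q(x)\, f\!\left(\frac{p(x)}{q(x)}\right),
  \qquad f \text{ convex},\ f(1) = 0.
\]
% Choosing f(t) = t \log t recovers the Kullback-Leibler divergence,
% and f(t) = -\log t the reverse Kullback-Leibler divergence:
\[
  D_{t \log t}(P \,\|\, Q) \;=\; \sum_{x} p(x) \log \frac{p(x)}{q(x)}
  \;=\; D_{\mathrm{KL}}(P \,\|\, Q).
\]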
