Monotonically Decreasing Sequence of Divergences
Divergences are quantities that measure the discrepancy between two probability distributions and play an important role in fields such as statistics and machine learning. A divergence is non-negative and equals zero if and only if the two distributions coincide. In addition, some important divergences, such as the f-divergence, are convex; we call these "convex divergences". In this note, we focus on convex divergences and introduce integral and differential operators. Applying either operator to a convex divergence yields another divergence; in particular, the integral operator preserves convexity. Furthermore, applying the integral operator repeatedly produces a monotonically decreasing sequence of convex divergences. Using these properties, we derive new sequences of convex divergences that include the Kullback-Leibler divergence or the reverse Kullback-Leibler divergence.
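As background for the convex divergences discussed above, the sketch below recalls the standard f-divergence definition and the convex generators that recover the Kullback-Leibler and reverse Kullback-Leibler divergences. This is standard notation only; the paper's integral and differential operators are defined in the full text and are not reproduced here.

```latex
% f-divergence: f is convex on (0, \infty) with f(1) = 0,
% so D_f(P || Q) >= 0 with equality iff P = Q.
\[
  D_f(P \,\|\, Q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx .
\]
% Choosing f(t) = t \log t recovers the Kullback--Leibler divergence,
% and f(t) = -\log t recovers the reverse Kullback--Leibler divergence:
\[
  D_{\mathrm{KL}}(P \,\|\, Q) = \int p(x) \log \frac{p(x)}{q(x)}\, dx ,
  \qquad
  D_{\mathrm{KL}}(Q \,\|\, P) = \int q(x) \log \frac{q(x)}{p(x)}\, dx .
\]
```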