On f-divergences between Cauchy distributions

01/29/2021
by Frank Nielsen et al.

We prove that the f-divergences between univariate Cauchy distributions are always symmetric and can be expressed as strictly increasing functions of the chi-squared divergence. We report the corresponding functions for the total variation distance, the Kullback-Leibler divergence, the LeCam-Vincze divergence, the squared Hellinger divergence, the Taneja divergence, and the Jensen-Shannon divergence. We then show that this symmetric f-divergence property no longer holds for multivariate Cauchy distributions. Finally, we present several metrizations of f-divergences between univariate Cauchy distributions.

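As a quick sanity check of the abstract's claim, the sketch below (not taken from the paper) numerically verifies two closed forms that are standard for univariate Cauchy densities p_{l,s}(x) = s / (pi ((x - l)^2 + s^2)): the symmetric chi-squared divergence ((l1 - l2)^2 + (s1 - s2)^2) / (2 s1 s2), and the Kullback-Leibler divergence written as the strictly increasing function log(1 + chi2 / 2) of it. The script, its function names, and the SciPy-based quadrature are illustrative assumptions, not the authors' code.

# Sanity-check sketch (assumed closed forms, not quoted from the paper):
#   chi2(p1, p2) = ((l1 - l2)^2 + (s1 - s2)^2) / (2 s1 s2)
#   D_KL(p1 : p2) = log(1 + chi2(p1, p2) / 2)
import numpy as np
from scipy.integrate import quad

def cauchy_pdf(x, loc, scale):
    """Density of the univariate Cauchy distribution with location/scale parameters."""
    return scale / (np.pi * ((x - loc) ** 2 + scale ** 2))

def chi2_closed_form(l1, s1, l2, s2):
    """Assumed closed-form chi-squared divergence between two univariate Cauchy densities."""
    return ((l1 - l2) ** 2 + (s1 - s2) ** 2) / (2.0 * s1 * s2)

def kl_closed_form(l1, s1, l2, s2):
    """KL divergence expressed as an increasing function of the chi-squared divergence."""
    return np.log(1.0 + 0.5 * chi2_closed_form(l1, s1, l2, s2))

def kl_numerical(l1, s1, l2, s2):
    """KL divergence by direct numerical integration, for comparison."""
    integrand = lambda x: cauchy_pdf(x, l1, s1) * np.log(cauchy_pdf(x, l1, s1) / cauchy_pdf(x, l2, s2))
    value, _ = quad(integrand, -np.inf, np.inf, limit=400)
    return value

if __name__ == "__main__":
    l1, s1, l2, s2 = 0.0, 1.0, 2.0, 3.0
    print("chi2 closed form :", chi2_closed_form(l1, s1, l2, s2))
    print("KL closed form   :", kl_closed_form(l1, s1, l2, s2))
    print("KL numerical     :", kl_numerical(l1, s1, l2, s2))
    # Symmetry check: swapping the two distributions should leave the value unchanged.
    print("KL swapped       :", kl_closed_form(l2, s2, l1, s1))

Under these assumed formulas, the closed-form and integrated KL values should agree up to quadrature error, and swapping the two distributions leaves the divergence unchanged, consistent with the symmetry statement in the abstract.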