Tsallis and Rényi deformations linked via a new λ-duality

07/26/2021
by Ting-Kam Leonard Wong, et al.

Tsallis and Rényi entropies, which are monotone transformations of each other, generalize the classical Shannon entropy and the exponential family of probability distributions to non-extensive statistical physics, information theory, and statistics. The q-exponential family, as a deformed exponential family with subtractive normalization, nevertheless reflects the classical Legendre duality of convex functions as well as the associated concept of Bregman divergence. In this paper we show that a generalized λ-duality, where λ = 1 - q is the constant information-geometric curvature, induces a deformed exponential family with divisive normalization and is linked to Rényi entropy and optimal transport. Our λ-duality unifies the two deformation models, which differ by a mere reparameterization, and provides an elegant and deep framework for studying the underlying mathematical structure. Under this duality, in which the Rényi entropy and divergence appear naturally, the λ-exponential family satisfies properties that parallel and generalize those of the exponential family. In particular, we give a new proof of the Tsallis entropy maximizing property of the q-exponential family. We also introduce a λ-mixture family which may be regarded as the dual of the λ-exponential family. Finally, we discuss a duality between the λ-exponential family and the logarithmic divergence, and study its statistical consequences.
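The monotone relation between the two entropies mentioned in the abstract can be checked numerically. The sketch below (not from the paper; function names are our own) uses the standard definitions S_q = (1 - Σ p_i^q)/(q - 1) for Tsallis entropy and H_q = log(Σ p_i^q)/(1 - q) for Rényi entropy, and verifies that H_q = log(1 + λ S_q)/λ with λ = 1 - q, the reparameterization the paper builds on.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1)."""
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def renyi_entropy(p, q):
    """Renyi entropy H_q = log(sum_i p_i^q) / (1 - q)."""
    return np.log(np.sum(p ** q)) / (1.0 - q)

p = np.array([0.5, 0.3, 0.2])  # an arbitrary probability vector
q = 0.7
lam = 1.0 - q  # the paper's lambda = 1 - q

S = tsallis_entropy(p, q)
H = renyi_entropy(p, q)

# Monotone transformation linking the two entropies:
#   H_q = log(1 + lam * S_q) / lam
assert np.isclose(H, np.log1p(lam * S) / lam)
```

Since 1 + λS_q = Σ p_i^q, the assertion holds identically for any distribution and any q ≠ 1; both entropies recover Shannon entropy in the limit q → 1.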
