An entropy functional bounded from above by one

04/20/2022
by John Çamkıran, et al.

Shannon entropy is widely used for quantifying the uncertainty of discrete random variables. When normalized to the unit interval, however, as is often done in practice, it fails to convey the alphabet size of the random variable under study. This work introduces an entropy functional based on the Jensen-Shannon divergence that is naturally bounded from above by one. Unlike normalized Shannon entropy, this new functional is strictly increasing in alphabet size under uniformity and is thus well suited to the characterization of discrete random variables.
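The paper's exact functional is not reproduced in this abstract, so the sketch below does not implement it. Instead, it illustrates the two facts the abstract relies on: normalized Shannon entropy equals one for every uniform distribution regardless of alphabet size, and the Jensen-Shannon divergence computed with base-2 logarithms is bounded above by one. The helper names (`shannon_entropy`, `normalized_entropy`, `jsd`) are illustrative, not from the paper.

```python
import math

def shannon_entropy(p, base=2.0):
    # Shannon entropy H(p) = -sum_i p_i log(p_i), with 0*log(0) taken as 0
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def normalized_entropy(p):
    # H(p) / log|A| lies in [0, 1] but discards the alphabet size |A|
    n = len(p)
    return shannon_entropy(p) / math.log2(n) if n > 1 else 0.0

def jsd(p, q):
    # Two-distribution Jensen-Shannon divergence with base-2 logs;
    # JSD(p, q) = H((p+q)/2) - (H(p) + H(q))/2, bounded above by 1
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

def uniform(n):
    # Uniform distribution on an alphabet of size n
    return [1.0 / n] * n

# Normalized entropy cannot distinguish a fair coin from a fair die:
# both uniform(2) and uniform(10) score (approximately) 1.0.
print(normalized_entropy(uniform(2)), normalized_entropy(uniform(10)))

# Base-2 JSD attains its upper bound of 1 on disjoint supports.
print(jsd([1.0, 0.0], [0.0, 1.0]))
```

Both uniform distributions score one under normalization, which is exactly the alphabet-size blindness the abstract criticizes, while the base-2 JSD supplies the natural upper bound of one that the proposed functional inherits.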
