An axiomatic characterization of mutual information

08/28/2021
by James Fullwood, et al.

We characterize mutual information as the unique map on ordered pairs of random variables satisfying a set of axioms similar to those of Faddeev's characterization of the Shannon entropy. Our characterization, however, contains a new axiom with no analogue for Shannon entropy, based on the notion of a Markov triangle, which may be thought of as a composition of communication channels for which conditional entropy acts functorially. Our proofs are coordinate-free in the sense that no logarithms appear in our calculations.
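For reference, the quantity being characterized is the standard mutual information, conventionally computed from Shannon entropies as I(X;Y) = H(X) + H(Y) - H(X,Y). The sketch below illustrates this conventional logarithmic computation only; it is not the paper's coordinate-free, logarithm-free approach. The function name and the example joint distribution are illustrative choices, not taken from the paper.

    import numpy as np

    def mutual_information(p_xy: np.ndarray) -> float:
        """I(X;Y) = H(X) + H(Y) - H(X,Y), in bits, from a joint table p(x,y)."""
        p_x = p_xy.sum(axis=1)  # marginal distribution of X
        p_y = p_xy.sum(axis=0)  # marginal distribution of Y

        def entropy(p: np.ndarray) -> float:
            p = p[p > 0]  # drop zero-probability outcomes (0 log 0 = 0)
            return -np.sum(p * np.log2(p))

        return entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())

    # Example: correlated binary input and output of a noisy channel.
    p_xy = np.array([[0.4, 0.1],
                     [0.1, 0.4]])
    print(mutual_information(p_xy))  # approximately 0.278 bits

Here both marginals are uniform (H(X) = H(Y) = 1 bit) while the joint entropy is about 1.722 bits, so the two variables share roughly 0.278 bits of information.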
