Towards a Non-Stochastic Information Theory

04/26/2019
by Anshuka Rangi, et al.

The δ-mutual information between uncertain variables is introduced as a generalization of Nair's non-stochastic information functional. Several properties of this new quantity are illustrated and used to prove a channel coding theorem in a non-stochastic setting. Namely, it is shown that the largest δ-mutual information between a metric space and its ϵ-packings equals the (ϵ, δ)-capacity of the space. This notion of capacity generalizes the Kolmogorov ϵ-capacity to packing sets whose overlap is at most δ, and is a variation of a previous definition proposed by one of the authors. These results provide a framework for developing a non-stochastic information theory, motivated by potential applications in control and learning theories. Compared to previous non-stochastic approaches, the theory admits the possibility of decoding errors, as in Shannon's probabilistic setting, while retaining a worst-case, non-stochastic character.
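
For orientation, the quantities named in the abstract can be sketched in LaTeX as follows. This is an illustrative reading only: the symbols 𝒳, d, P, C_ϵ, C_{ϵ,δ}, and I_δ are assumed notation, and the precise definitions (in particular, how "overlap at most δ" is measured) are those given in the paper itself.

\begin{align*}
  % Kolmogorov \epsilon-capacity of a metric space (\mathcal{X}, d): the log
  % of the size of a largest \epsilon-packing, i.e. a largest subset whose
  % points are pairwise more than \epsilon apart (notation assumed).
  C_\epsilon(\mathcal{X})
    &= \log \max \{\, |P| : P \subseteq \mathcal{X},\
       d(x, x') > \epsilon \ \text{for all distinct } x, x' \in P \,\}, \\
  % The (\epsilon, \delta)-capacity relaxes exact packings to packing sets
  % whose overlap is at most \delta; at \delta = 0 it should reduce to the
  % Kolmogorov capacity:
  C_{\epsilon, 0}(\mathcal{X}) &= C_\epsilon(\mathcal{X}), \\
  % Coding theorem stated in the abstract: the largest \delta-mutual
  % information between the space and its \epsilon-packings equals the
  % (\epsilon, \delta)-capacity:
  C_{\epsilon, \delta}(\mathcal{X}) &= \max_{P}\, I_\delta(\mathcal{X}; P).
\end{align*}

The middle line records the consistency requirement implied by "generalizes the Kolmogorov ϵ-capacity": when no overlap is allowed (δ = 0), the relaxed capacity should coincide with the classical one.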
