Directed Data-Processing Inequalities for Systems with Feedback
We present novel data-processing inequalities relating the mutual information and the directed information in systems with feedback. The internal blocks within such systems are restricted only to be causal mappings, but are allowed to be nonlinear, stochastic, and time-varying. These blocks can represent, for example, source encoders, decoders, or even communication channels. Moreover, the involved signals can be arbitrarily distributed. Our first main result relates mutual and directed information and can be interpreted as a law of conservation of information flow. Our second main result is a pair of data-processing inequalities (one being the conditional version of the other) between nested pairs of random sequences entirely within the closed loop. Our third main result introduces and characterizes the notion of in-the-loop (ITL) transmission rate for channel-coding scenarios in which the messages are internal to the loop. Interestingly, in this case the conventional notions of transmission rate, associated with the entropy of the messages, and of channel capacity, based on maximizing the mutual information between the messages and the output, turn out to be inadequate. Instead, as we show, the ITL transmission rate is the unique notion of rate for which a channel code attains zero error probability if and only if this ITL rate does not exceed the corresponding directed information rate from messages to decoded messages. We apply our data-processing inequalities to show that the supremum of achievable (in the usual channel-coding sense) ITL transmission rates is upper bounded by the supremum of the directed information rate across the communication channel. Moreover, we present an example in which this upper bound is attained. Finally, ...
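As background for the terminology above (a standard definition assumed here, not a statement taken from the paper itself), the directed information from a sequence $X^n$ to a sequence $Y^n$ can be taken to be Massey's causally conditioned quantity
$$ I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i ; Y_i \mid Y^{i-1}), $$
which differs from the mutual information $I(X^n ; Y^n) = \sum_{i=1}^{n} I(X^n ; Y_i \mid Y^{i-1})$ (chain rule) in that each term conditions only on past and present inputs $X^i$, thereby capturing the causal flow of information that the results above relate to $I(X^n ; Y^n)$.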