Measuring Information from Moments
We investigate the problem of representing information measures in terms of the moments of the underlying random variables. First, we derive polynomial approximations of the conditional expectation operator. We then apply these approximations to bound the best mean-square error achievable by a polynomial estimator, referred to here as the PMMSE. In Gaussian channels, the PMMSE coincides with the minimum mean-square error (MMSE) if and only if the input is either Gaussian or constant, i.e., if and only if the conditional expectation of the channel input given the output is a polynomial of degree at most 1. By combining the PMMSE with the I-MMSE relationship, we derive new formulas for information measures (e.g., differential entropy, mutual information) expressed in terms of the moments of the underlying random variables. As an application, we introduce estimators of information measures from data by approximating the moments in our formulas with sample moments. These estimators are shown to be asymptotically consistent and to possess desirable properties, e.g., invariance to affine transformations when used to estimate mutual information.
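As a rough illustration of the PMMSE idea, the best polynomial estimator of X from Y of degree at most n is the orthogonal projection of X onto span{1, Y, ..., Y^n}, and its mean-square error depends only on joint moments, which can in turn be replaced by sample moments. The following is a minimal sketch of this computation; the function and variable names are illustrative and not taken from the paper.

```python
import numpy as np

def pmmse_from_samples(x, y, degree=2):
    """Sample-moment estimate of the degree-n PMMSE of X given Y.

    Computes E[X^2] - b^T M^{-1} b, where M is the moment matrix of
    (1, Y, ..., Y^n) and b collects the cross moments E[X Y^k], with all
    expectations replaced by sample averages.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Vandermonde basis (1, Y, ..., Y^n); its Gram matrix holds the
    # sample moments E[Y^{i+j}] for 0 <= i, j <= n.
    V = np.vander(y, N=degree + 1, increasing=True)
    gram = V.T @ V / len(y)      # sample moment matrix of the basis
    cross = V.T @ x / len(y)     # sample cross moments E[X Y^k]
    coeffs = np.linalg.solve(gram, cross)
    return np.mean(x**2) - cross @ coeffs

# Gaussian channel Y = X + N with standard Gaussian input and noise:
# the degree-1 PMMSE already matches the MMSE of 0.5, consistent with
# the Gaussian-input case described above.
rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = x + rng.normal(size=100_000)
print(pmmse_from_samples(x, y, degree=1))  # close to 0.5
```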