Remote Source Coding under Gaussian Noise: Dueling Roles of Power and Entropy Power
The distributed remote source coding (so-called CEO) problem is studied in the case where the underlying source has finite differential entropy and the observation noise is Gaussian. The main result is a new lower bound for the sum-rate-distortion function under arbitrary distortion measures. When specialized to the case of mean-squared error, it is shown that the bound exactly mirrors a corresponding upper bound, except that the upper bound has the source power (variance) whereas the lower bound has the source entropy power. Bounds exhibiting this pleasing duality of power and entropy power have been well known for direct and centralized source coding since Shannon's work.
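To make the power/entropy-power duality concrete, the following recalls the classical point-to-point bounds for mean-squared error that the abstract references; this is the well-known direct-coding case, not the paper's new sum-rate bound, and the notation N(X), \sigma_X^2, R(D) is introduced here only for illustration. Writing the entropy power as N(X) = \frac{1}{2\pi e} e^{2h(X)}, the Shannon lower bound and the Gaussian-achievability upper bound read
\[
\frac{1}{2}\log\frac{N(X)}{D} \;\le\; R(D) \;\le\; \frac{1}{2}\log\frac{\sigma_X^2}{D},
\]
so the lower bound carries the entropy power exactly where the upper bound carries the power (variance), and the two coincide when the source is Gaussian, since then N(X) = \sigma_X^2.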