Remote Source Coding under Gaussian Noise: Dueling Roles of Power and Entropy Power

05/16/2018
by Krishnan Eswaran, et al.

The distributed remote source coding (so-called CEO) problem is studied in the case where the underlying source has finite differential entropy and the observation noise is Gaussian. The main result is a new lower bound for the sum-rate-distortion function under arbitrary distortion measures. When specialized to the case of mean-squared error, it is shown that the bound exactly mirrors a corresponding upper bound, except that the upper bound has the source power (variance) whereas the lower bound has the source entropy power. Bounds exhibiting this pleasing duality of power and entropy power have been well known for direct and centralized source coding since Shannon's work.
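For context, the classical bounds that exhibit this duality in direct (point-to-point) source coding can be sketched as follows. This is a standard statement, not taken from the paper: for a real-valued source X with variance \sigma^2, differential entropy h(X), and mean-squared error distortion D \le \sigma^2, the rate-distortion function R(D) is squeezed between the Shannon lower bound, which involves the entropy power N(X), and the Gaussian upper bound, which involves the power \sigma^2.

% Standard rate-distortion bounds under mean-squared error (illustrative, not from this paper)
\[
  \underbrace{\tfrac{1}{2}\log\frac{N(X)}{D}}_{\text{Shannon lower bound}}
  \;\le\; R(D) \;\le\;
  \underbrace{\tfrac{1}{2}\log\frac{\sigma^2}{D}}_{\text{Gaussian upper bound}},
  \qquad
  N(X) \triangleq \frac{1}{2\pi e}\, e^{2 h(X)}.
\]

The paper's contribution is a lower bound of this flavor for the distributed remote (CEO) setting: the sum-rate-distortion upper bound is driven by the source power, while the new lower bound replaces power with entropy power.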
