Minimum entropy of a log-concave variable for fixed variance

09/04/2023
by James Melbourne et al.

We show that, among log-concave real random variables with fixed variance, the Shannon differential entropy is minimized by an exponential random variable. We apply this result to derive upper bounds on the capacities of additive noise channels with log-concave noise. We also improve the constants in reverse entropy power inequalities for log-concave random variables.
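Stated as an inequality (a restatement of the main claim under natural logarithms; the paper's exact normalization may differ): if X is a real log-concave random variable with Var(X) = \sigma^2 and E is an exponential random variable with the same variance, then

    h(X) \;\ge\; h(E) \;=\; 1 + \log \sigma .

This uses the standard facts that an exponential with rate \lambda = 1/\sigma has variance \sigma^2 and differential entropy 1 - \log \lambda. For comparison, a Gaussian with the same variance has h = \tfrac{1}{2}\log(2\pi e \sigma^2) \approx 1.42 + \log \sigma, consistent with the exponential, not the Gaussian, being the minimizer in the log-concave class.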
