Covert Communication over Two Types of Additive Noise Channels
We extend previous results on covert communication over the additive white Gaussian noise channel to two other types of additive noise channels. The first is the Gaussian channel with memory, where the noise sequence is a Gaussian vector with an arbitrary invertible covariance matrix. We show that the fundamental limit for covert communication over such a channel is the same as over the channel with white, i.e., memoryless, Gaussian noise. The second type of channel we consider is one with memoryless generalized Gaussian noise. For such a channel we prove a general upper bound on the dominant term in the maximum number of nats that can be covertly communicated over n channel uses. When the shape parameter p of the generalized Gaussian noise distribution is in the interval (0, 1], we also prove a matching lower bound.
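The abstract's second channel model uses memoryless generalized Gaussian noise with shape parameter p. As a hedged illustration only (the parameterization below is the standard one, not taken from the paper; the scale `alpha` and the normalization constant are assumptions), the density can be sketched as:

```python
import math

def gen_gaussian_pdf(x, p, alpha=1.0):
    """Standard generalized Gaussian density with shape p > 0 and scale alpha:
    f(x) = p / (2 * alpha * Gamma(1/p)) * exp(-(|x| / alpha)**p).
    p = 1 gives the Laplace density; p = 2 gives a (rescaled) Gaussian."""
    c = p / (2.0 * alpha * math.gamma(1.0 / p))
    return c * math.exp(-((abs(x) / alpha) ** p))

# At x = 0 with alpha = 1 the density equals p / (2 * Gamma(1/p)):
print(gen_gaussian_pdf(0.0, p=1.0))  # Laplace peak: 0.5
```

Smaller p produces heavier tails, which is why the regime p in (0, 1] treated in the abstract behaves differently from the Gaussian case p = 2.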