Information Theoretic Bound on Optimal Worst-case Error in Binary Mixture Identification
Identifying latent binary sequences from a pool of noisy observations has a wide range of applications in both statistical learning and population genetics. Each observed sequence is the result of passing one of the latent mother sequences through a binary symmetric channel, making this setting a special case of Bernoulli Mixture Models. This paper establishes an asymptotically tight upper bound on the error of Maximum Likelihood mixture identification in such problems, providing fundamental guarantees on the inference accuracy of the optimal estimator. To this end, we find the closest pair of discrete distributions with respect to the Chernoff Information measure, and we provide a novel technique for lower-bounding the Chernoff Information efficiently. We also show that a drastic phase transition occurs at noise level 0.25: the identification problem becomes substantially harder once the noise probability exceeds this threshold.
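As a concrete illustration of the quantity the abstract centers on, the sketch below computes the Chernoff Information C(p, q) = max_{s in [0,1]} -log sum_x p(x)^s q(x)^(1-s) between two discrete distributions by grid search over the exponent s. This is a generic numerical sketch, not the paper's lower-bounding technique; the function name and the example noise level are illustrative choices.

```python
import numpy as np

def chernoff_information(p, q, grid=1001):
    """Chernoff Information between two discrete distributions p and q:
        C(p, q) = max_{s in [0, 1]} -log sum_x p(x)**s * q(x)**(1 - s)
    evaluated here by a simple grid search over s (illustrative, not the
    paper's method)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    s = np.linspace(0.0, 1.0, grid)[:, None]   # one exponent value per row
    # For each s, compute sum_x p(x)^s q(x)^(1-s) via broadcasting
    moments = np.sum(p**s * q**(1.0 - s), axis=1)
    return float(np.max(-np.log(moments)))

# Example: the two output distributions of a binary symmetric channel with
# flip probability eps, given latent bit 0 vs. latent bit 1.
eps = 0.1
p = np.array([1 - eps, eps])
q = np.array([eps, 1 - eps])
print(chernoff_information(p, q))
```

For this symmetric Bernoulli pair the maximum is attained at s = 1/2, giving C = -log(2 * sqrt(eps * (1 - eps))), which the grid search recovers numerically.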