Uncertainty about Uncertainty: Near-Optimal Adaptive Algorithms for Estimating Binary Mixtures of Unknown Coins
Given a mixture between two populations of coins, "positive" coins that have (unknown and potentially different) probabilities of heads ≥1/2+Δ and "negative" coins with probabilities ≤1/2−Δ, we consider the task of estimating the fraction ρ of coins of each type to within additive error ϵ. We introduce new techniques to show a fully-adaptive lower bound of Ω(ρ/(ϵ²Δ²)) samples (for constant probability of success). We achieve almost-matching algorithmic performance of O(ρ/(ϵ²Δ²)·(1+ρ log 1/ϵ)) samples, which matches the lower bound except in the regime where ρ=ω(1/log(1/ϵ)). The fine-grained adaptive flavor of both our algorithm and lower bound contrasts with much previous work in distributional testing and learning.
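To make the estimation task concrete, the sketch below simulates the setup with a naive non-adaptive baseline, not the paper's adaptive algorithm: each sampled coin is flipped a fixed number of times and classified as positive by majority vote. All parameter values (Δ=0.2, ρ=0.3, 101 flips per coin) are illustrative assumptions.

```python
import random

def naive_estimate(coins, flips_per_coin, rng):
    """Estimate the fraction of positive coins by majority vote.

    A hypothetical non-adaptive baseline (NOT the paper's algorithm):
    flip every coin a fixed number of times and classify it as
    positive if it lands heads more than half the time.
    """
    positive = 0
    for p in coins:
        heads = sum(rng.random() < p for _ in range(flips_per_coin))
        if heads > flips_per_coin / 2:
            positive += 1
    return positive / len(coins)

rng = random.Random(0)
delta = 0.2  # bias gap: positive coins have bias >= 1/2 + delta
rho = 0.3    # true fraction of positive coins to be estimated
coins = [0.5 + delta if rng.random() < rho else 0.5 - delta
         for _ in range(2000)]
est = naive_estimate(coins, flips_per_coin=101, rng=rng)
print(f"true rho = {rho}, estimate = {est:.3f}")
```

This baseline spends the same budget on every coin; the point of the paper's adaptive approach is to stop flipping a coin early once its type is already clear, which is how the near-optimal sample complexity above is achieved.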