The Polynomial Method is Universal for Distribution-Free Correlational SQ Learning
We consider the problem of distribution-free learning of Boolean function classes in the PAC and agnostic models. Generalizing a recent beautiful work of Malach and Shalev-Shwartz (2020), who gave the first tight correlational SQ (CSQ) lower bounds for learning DNF formulas, we show that lower bounds on the threshold degree or approximate degree of any function class directly imply CSQ lower bounds for PAC or agnostic learning, respectively. These lower bounds match the corresponding positive results, which use upper bounds on the threshold or approximate degree to obtain SQ algorithms for PAC or agnostic learning. Many of these results were implicit in earlier works of Feldman and Sherstov.
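For reference, the two degree notions named in the abstract have standard definitions (these are not quoted from the paper, and the implication below is only a schematic summary of the claim, with the quantitative query/tolerance trade-off left abstract). For a Boolean function f : {-1,1}^n -> {-1,1}:

  \deg_{\pm}(f) = \min\{\deg p : p(x)\cdot f(x) > 0 \ \ \forall x \in \{-1,1\}^n\}   (threshold degree)
  \widetilde{\deg}_{1/3}(f) = \min\{\deg p : |p(x) - f(x)| \le 1/3 \ \ \forall x \in \{-1,1\}^n\}   (approximate degree)

Schematically, the result says that for a class \mathcal{C}, a lower bound on \deg_{\pm} over \mathcal{C} yields a distribution-free CSQ lower bound for PAC learning \mathcal{C}, and a lower bound on \widetilde{\deg}_{1/3} yields a CSQ lower bound for agnostic learning \mathcal{C}, matching the known SQ upper bounds in each setting.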