Learning with Semi-Definite Programming: new statistical bounds based on fixed point analysis and excess risk curvature

04/04/2020
by Stéphane Chrétien et al.

Many statistical learning problems have recently been shown to be amenable to Semi-Definite Programming (SDP), with community detection and clustering in Gaussian mixture models as the most striking instances [Javanmard et al., 2016]. Given the growing range of applications of SDP-based techniques to machine learning problems, and the rapid progress in the design of efficient algorithms for solving SDPs, an intriguing question is how recent advances from empirical process theory can be put to work to provide a precise statistical analysis of SDP estimators. In the present paper, we borrow cutting-edge techniques and concepts from the learning theory literature, such as fixed point equations and excess risk curvature arguments, to derive general estimation and prediction results for a wide class of SDP estimators. From this perspective, we revisit some classical results in community detection from [Guédon et al., 2016] and [Chen et al., 2016], and we obtain statistical guarantees for SDP estimators used in signed clustering, group synchronization and MAXCUT.
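As a minimal illustration of the kind of SDP estimator the abstract refers to (this is not the paper's analysis, only a standard construction), the sketch below sets up the classical MAXCUT SDP relaxation, max (1/4)⟨L, X⟩ subject to X ⪰ 0 and diag(X) = 1, solves it approximately via a Burer-Monteiro factorization X = VVᵀ with projected gradient ascent, and extracts a cut by Goemans-Williamson hyperplane rounding. The rank `k`, step size, and iteration count are illustrative choices, not tuned values.

```python
import numpy as np

def maxcut_sdp(W, k=3, steps=500, lr=0.1, seed=0):
    """Approximately solve the MAXCUT SDP relaxation
        max (1/4) <L, X>  s.t.  X PSD, diag(X) = 1
    via the Burer-Monteiro factorization X = V V^T, with the rows of V
    constrained to the unit sphere (projected gradient ascent sketch)."""
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W                      # graph Laplacian
    rng = np.random.default_rng(seed)
    V = rng.standard_normal((n, k))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    for _ in range(steps):
        V = V + lr * (L @ V)                            # ascent step on (1/4)<L, V V^T>
        V /= np.linalg.norm(V, axis=1, keepdims=True)   # re-project rows to the sphere
    return V

def hyperplane_rounding(V, W, trials=50, seed=1):
    """Goemans-Williamson rounding: sign pattern of V against a random hyperplane."""
    rng = np.random.default_rng(seed)
    L = np.diag(W.sum(axis=1)) - W
    best_cut, best_x = -np.inf, None
    for _ in range(trials):
        x = np.sign(V @ rng.standard_normal(V.shape[1]))
        x[x == 0] = 1.0
        cut = 0.25 * x @ L @ x                          # weight of edges crossing the cut
        if cut > best_cut:
            best_cut, best_x = cut, x
    return best_cut, best_x

# Toy instance: the 4-cycle, whose maximum cut severs all four edges.
W = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
V = maxcut_sdp(W)
cut, x = hyperplane_rounding(V, W)
```

On the 4-cycle the SDP optimum is attained by a rank-one solution aligned with the bipartition, so the rounded cut recovers all four edges; on harder instances the rounding only guarantees an approximation ratio.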
