Multi-Complementary and Unlabeled Learning for Arbitrary Losses and Models
A weakly supervised learning framework called complementary-label learning has been proposed recently, in which each sample is equipped with a single complementary label denoting one of the classes the sample does not belong to. However, existing complementary-label learning methods cannot learn from samples with an arbitrary number of complementary labels, even though such samples are more informative, and they also neglect unlabeled samples. This paper presents a multi-complementary and unlabeled learning framework with two unbiased estimators of the classification risk that hold for arbitrary losses and models. We first derive an unbiased estimator of the classification risk from multi-complementarily labeled samples and then incorporate unlabeled samples into the risk formulation. Estimation error bounds show that the proposed methods attain the optimal parametric convergence rate. Finally, experiments on both linear and deep models demonstrate the effectiveness of our methods.
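For a flavor of how such an unbiased estimator can be built, consider the case where each sample carries a set of s complementary labels drawn uniformly among the size-s subsets of classes excluding the true label (an illustrative generation model, not necessarily the paper's exact formulation). Each wrong class then lands in the complementary set with probability s/(K-1), so the expected loss summed over the complementary set equals s/(K-1) times the total loss over all K classes minus the loss on the true class; rearranging gives the per-sample unbiased estimate sum_k loss_k - (K-1)/s * sum over the complementary set of loss_k. The minimal sketch below implements this with softmax cross-entropy as the per-class loss; the function name, the uniform-sampling assumption, and the loss choice are placeholders, not the paper's own code.

```python
import numpy as np

def mcl_unbiased_risk(logits, comp_label_sets):
    """Unbiased empirical risk from multi-complementarily labeled samples.

    Assumes each sample's complementary-label set was drawn uniformly
    among the size-s subsets of classes that exclude the true label.

    logits          : (n, K) array of model outputs
    comp_label_sets : length-n list of complementary-label index lists
    """
    n, K = logits.shape
    # Per-class loss matrix: loss[i, k] = -log softmax(logits[i])[k]
    # (cross-entropy here as an example; any per-class loss works).
    shifted = logits - logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    loss = -log_probs

    risk = 0.0
    for i, bar_y in enumerate(comp_label_sets):
        s = len(bar_y)
        # Per-sample estimate: sum_k loss_k - (K-1)/s * sum_{k in bar_y} loss_k
        risk += loss[i].sum() - (K - 1) / s * loss[i, list(bar_y)].sum()
    return risk / n

# Toy usage: 4 samples, 5 classes, varying numbers of complementary labels.
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 5))
comp_sets = [[0, 2], [1], [3, 4, 0], [2, 3]]
print(mcl_unbiased_risk(logits, comp_sets))
```

For s = 1 this reduces to the known single-complementary-label estimator sum_k loss_k - (K-1) * loss on the complementary label. Note also that the first term of the estimator involves no label at all, only the model's outputs on x, which hints at how unlabeled samples can additionally contribute to the risk formulation.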