Boost Picking: A Universal Method on Converting Supervised Classification to Semi-supervised Classification

02/18/2016
by Fuqiang Liu, et al.

This paper proposes Boost Picking, a universal method for training supervised classification models mainly with unlabeled data. Boost Picking adopts only two weak classifiers to estimate and correct the error. It is theoretically proved that Boost Picking can train a supervised model mainly with unlabeled data as effectively as the same model trained with 100% labeled data, provided that the recalls of the two weak classifiers are both greater than zero and the sum of their precisions is greater than one. Based on Boost Picking, we present "Test along with Training (TawT)" to improve the generalization of supervised models. Both Boost Picking and TawT are successfully tested on a variety of small data sets.
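The abstract does not spell out the picking rule, so the following is only a minimal sketch of the general idea: two weak classifiers, trained on a small labeled pool, pseudo-label the unlabeled data, and the supervised model is then trained mainly on those pseudo-labels. The agreement-based picking rule, the choice of weak learners, and all names (weak_a, weak_b) are illustrative assumptions, not the authors' algorithm.

```python
# Hedged sketch of a two-weak-classifier pseudo-labeling scheme in the
# spirit of the abstract; NOT the paper's Boost Picking algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# Tiny labeled pool; the rest is treated as unlabeled (its labels are
# held out and never used for training).
n_labeled = 50
X_lab, y_lab = X[:n_labeled], y[:n_labeled]
X_unlab = X[n_labeled:1500]
X_test, y_test = X[1500:], y[1500:]

# Two deliberately weak classifiers fit on the small labeled pool
# (assumed stand-ins for the paper's two weak classifiers).
weak_a = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_lab, y_lab)
weak_b = LogisticRegression(max_iter=500).fit(X_lab, y_lab)

# "Pick" unlabeled examples on which the two weak learners agree and
# treat the agreed prediction as a pseudo-label. Intuitively, this is
# where the precision/recall condition matters: if both recalls are
# positive and the precisions sum to more than one, agreement is
# informative rather than noise.
pred_a = weak_a.predict(X_unlab)
pred_b = weak_b.predict(X_unlab)
agree = pred_a == pred_b
X_picked, y_picked = X_unlab[agree], pred_a[agree]

# Train the final supervised model mainly on pseudo-labeled data.
final = LogisticRegression(max_iter=500).fit(
    np.vstack([X_lab, X_picked]),
    np.concatenate([y_lab, y_picked]),
)
print("test accuracy:", accuracy_score(y_test, final.predict(X_test)))
```

With far more pseudo-labeled examples than labeled ones, the final model is indeed trained "mainly by unlabeled data," which is the setting the abstract describes.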
