Improving Consistency-Based Semi-Supervised Learning with Weight Averaging

06/14/2018
by   Ben Athiwaratkun, et al.

Recent advances in deep unsupervised learning have renewed interest in semi-supervised methods, which can learn from both labeled and unlabeled data. Presently, the most successful approaches to semi-supervised learning are based on consistency regularization, whereby a model is trained to be robust to small perturbations of its inputs and parameters. We show that consistency regularization leads to flatter but narrower optima. We also show that the test error surface for these methods is approximately convex in regions of weight space traversed by SGD. Inspired by these observations, we propose to train consistency-based semi-supervised models with stochastic weight averaging (SWA), a recent method that averages weights along the trajectory of SGD. We also develop fast-SWA, which further accelerates convergence by averaging multiple points within each cycle of a cyclical learning rate schedule. With fast-SWA we achieve the best known semi-supervised results on CIFAR-10 and CIFAR-100 over many different numbers of observed training labels. For example, we achieve a 5.0% error rate on CIFAR-10, improving on the previous best result in the literature. We also improve on the best known result of 80% on CIFAR-100. Finally, we show that with fast-SWA the simple Π model becomes state-of-the-art for large labeled settings.
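The core idea of fast-SWA, as the abstract describes it, is to run SGD with a cyclical learning rate and average the weights from several points within each cycle, rather than only one point per cycle as in plain SWA. The sketch below illustrates this on a toy quadratic objective; the linear cyclical schedule, the cycle length, and the `avg_last_k` averaging window are illustrative assumptions, not the paper's exact hyperparameters.

```python
import numpy as np

def cyclical_lr(step, lr_max=0.1, lr_min=0.001, cycle_len=10):
    """Linearly decaying cyclical learning rate (one common choice;
    the paper's exact schedule is an assumption here)."""
    t = (step % cycle_len) / max(cycle_len - 1, 1)
    return lr_max - t * (lr_max - lr_min)

def train_fast_swa(w0, grad_fn, steps=100, cycle_len=10, avg_last_k=5):
    """Sketch of fast-SWA: plain SGD under a cyclical learning rate,
    averaging the weights from the last `avg_last_k` steps of every
    cycle instead of only the cycle-end weights (as plain SWA would)."""
    w = np.asarray(w0, dtype=float)
    swa_sum = np.zeros_like(w)
    n_avg = 0
    for step in range(steps):
        lr = cyclical_lr(step, cycle_len=cycle_len)
        w = w - lr * grad_fn(w)  # one SGD update
        # Collect multiple points per cycle for the running average.
        if step % cycle_len >= cycle_len - avg_last_k:
            swa_sum += w
            n_avg += 1
    return swa_sum / n_avg  # averaged weights used at test time

# Toy usage: quadratic loss 0.5 * ||w||^2, whose gradient is w itself.
w_swa = train_fast_swa(np.array([2.0, -3.0]), grad_fn=lambda w: w)
```

In practice the averaged weights replace the final SGD iterate at evaluation time; averaging many nearby solutions tends to land in a flatter, more central region of the optimum, which the paper argues is especially beneficial for consistency-regularized models.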
