Learning from Small Data Through Sampling an Implicit Conditional Generative Latent Optimization Model

03/31/2020
by   Idan Azuri, et al.

We revisit the long-standing problem of learning from a small sample. In recent years, major efforts have been invested in generating new samples from a small set of training data points. Some approaches use classical transformations; others synthesize entirely new examples. Our approach belongs to the latter category. We propose a new model based on conditional Generative Latent Optimization (cGLO). Our model learns to synthesize entirely new samples for every class by interpolating between training samples in the latent space. The proposed method samples the learned latent space using spherical linear interpolation (slerp) and generates a new sample with the trained generator. Our empirical results show that the sampled set is diverse enough to improve image classification over the state of the art when training on small subsets of CIFAR-100 and CUB-200.
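The latent-space sampling step can be sketched as follows. This is a minimal, generic illustration of slerp-based sampling, not the authors' code: the latent dimension, variable names, and random codes are placeholder assumptions; in the paper's setting the two codes would be the optimized latent vectors of two same-class training images, and the interpolated code would be decoded by the trained cGLO generator.

```python
import numpy as np

def slerp(z1, z2, t):
    """Spherical linear interpolation between latent vectors z1 and z2 at ratio t."""
    z1_n = z1 / np.linalg.norm(z1)
    z2_n = z2 / np.linalg.norm(z2)
    omega = np.arccos(np.clip(np.dot(z1_n, z2_n), -1.0, 1.0))
    if np.isclose(omega, 0.0):
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1.0 - t) * z1 + t * z2
    return (np.sin((1.0 - t) * omega) * z1 + np.sin(t * omega) * z2) / np.sin(omega)

# Toy usage with random 128-dim codes (placeholder for two optimized latent
# codes of the same class); the result would then be passed to the generator.
rng = np.random.default_rng(0)
z_a, z_b = rng.standard_normal(128), rng.standard_normal(128)
z_new = slerp(z_a, z_b, t=0.5)
print(z_new.shape)  # (128,)
```

Slerp follows the great circle between the two codes rather than the straight chord, which keeps interpolated points at a norm typical of the latent distribution and tends to yield more plausible decoded samples than linear interpolation.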
