Cross-Architectural Positive Pairs improve the effectiveness of Self-Supervised Learning

01/27/2023
by Pranav Singh, et al.

Existing self-supervised techniques have extreme computational requirements and suffer a substantial drop in performance with a reduction in batch size or pretraining epochs. This paper presents Cross Architectural - Self Supervision (CASS), a novel self-supervised learning approach that leverages a Transformer and a CNN simultaneously. Compared to existing state-of-the-art self-supervised learning approaches, we empirically show that CASS-trained CNNs and Transformers across four diverse datasets gained an average of 3.8% with 1% labeled data, 5.9% with 10% labeled data, and 10.13% with 100% labeled data, while taking 69% less time. We also show that CASS is much more robust to changes in batch size and training epochs than existing state-of-the-art self-supervised learning approaches. We have open-sourced our code at https://github.com/pranavsinghps1/CASS.
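For intuition, below is a minimal PyTorch sketch of the core idea: the same image is encoded once by a CNN and once by a Transformer, and the two embeddings are treated as a cross-architectural positive pair. The backbone choices (ResNet-50 and ViT-B/16), the linear projection heads, and the negative-cosine-similarity loss are illustrative assumptions, not the authors' exact implementation; the real training objective and architecture details are in the linked repository.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision

class CrossArchPair(nn.Module):
    """Sketch of a cross-architectural positive pair: CNN + Transformer branches.

    Backbones, projector size, and loss form are assumptions for illustration.
    """

    def __init__(self, feat_dim: int = 256):
        super().__init__()
        # CNN branch: ResNet-50 trunk with its classification head removed
        cnn = torchvision.models.resnet50(weights=None)
        cnn_out = cnn.fc.in_features
        cnn.fc = nn.Identity()
        self.cnn = cnn
        # Transformer branch: ViT-B/16 trunk with its classification head removed
        vit = torchvision.models.vit_b_16(weights=None)
        vit_out = vit.heads.head.in_features
        vit.heads = nn.Identity()
        self.vit = vit
        # Linear projection heads mapping both embeddings into a shared space
        self.cnn_proj = nn.Linear(cnn_out, feat_dim)
        self.vit_proj = nn.Linear(vit_out, feat_dim)

    def forward(self, x):
        # The same image is the positive pair: one view per architecture.
        z_cnn = self.cnn_proj(self.cnn(x))
        z_vit = self.vit_proj(self.vit(x))
        return z_cnn, z_vit


def cross_arch_loss(z_cnn, z_vit):
    # Pull the CNN and Transformer embeddings of the same image together
    # (negative cosine similarity, averaged over the batch).
    return -F.cosine_similarity(z_cnn, z_vit, dim=-1).mean()


if __name__ == "__main__":
    model = CrossArchPair()
    images = torch.randn(2, 3, 224, 224)  # dummy batch of 224x224 RGB images
    z_cnn, z_vit = model(images)
    loss = cross_arch_loss(z_cnn, z_vit)
    loss.backward()
    print(f"cross-architectural loss: {loss.item():.4f}")
```

Because the two branches see the same image rather than two augmented crops of it, each architecture's inductive biases supply the "view" diversity; this is one plausible reading of why such a setup can be less sensitive to batch size than contrastive methods that rely on large pools of negatives.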
