Dropping Networks for Transfer Learning

04/23/2018
by James O'Neill, et al.

In natural language understanding, many challenges require learning relationships between two sequences, for example similarity, relatedness, paraphrase detection and question matching. Some of these tasks are inherently closer in nature, so knowledge acquired on one task is more easily transferred and adapted to another. However, transferring all knowledge may be undesirable and can lead to sub-optimal results due to negative transfer. Hence, this paper focuses on the transferability of both instances and parameters across natural language understanding tasks, using an ensemble-based transfer learning method to circumvent such issues. The primary contribution of this paper is the combination of Dropout and Bagging for improved transferability in neural networks, herein referred to as Dropping. Secondly, we present a straightforward yet novel approach for incorporating source Dropping Networks into a target task for few-shot learning while mitigating negative transfer. This is achieved by using a decaying parameter chosen according to the slope changes of a smoothed spline error curve at sub-intervals during training. We evaluate the approach against hard parameter sharing, soft parameter sharing and single-task learning. The proposed adjustment improves transfer learning performance and yields results comparable to the current state of the art while using only a few instances from the target task.
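The core "Dropping" idea, combining Bagging with Dropout, can be illustrated with a minimal sketch: each ensemble member is a dropout network trained on its own bootstrap resample of the source data, and the ensemble's averaged prediction is later mixed with a target model under a decaying weight. This is not the authors' code; the simple MLP encoder, layer sizes, and function names below are illustrative assumptions, and the spline-based decay schedule from the abstract is only indicated by the mixing weight `alpha`.

```python
# Minimal sketch of a "Dropping" ensemble: Bagging over Dropout networks.
# All architecture choices and hyperparameters here are assumptions.
import torch
import torch.nn as nn


class DropoutNet(nn.Module):
    def __init__(self, in_dim=300, hidden=128, n_classes=3, p=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)


def train_dropping_ensemble(X, y, n_members=5, epochs=10, lr=1e-3):
    """Train each member on its own bootstrap resample (Bagging)."""
    members = []
    for _ in range(n_members):
        idx = torch.randint(0, len(X), (len(X),))  # bootstrap sample with replacement
        model = DropoutNet(in_dim=X.shape[1])
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(model(X[idx]), y[idx])
            loss.backward()
            opt.step()
        members.append(model)
    return members


def mixed_logits(members, target_model, x, alpha):
    """Mix the source ensemble's averaged logits with the target model's logits.

    `alpha` stands in for the decaying transfer weight; in the paper it is
    scheduled from the slope changes of a smoothed spline error curve.
    """
    with torch.no_grad():
        source = torch.stack([m(x) for m in members]).mean(dim=0)
    return alpha * source + (1.0 - alpha) * target_model(x)
```

In this sketch, a larger `alpha` early in target training leans on the source ensemble, and decaying it over sub-intervals shifts weight to the target model, which is how the abstract describes mitigating negative transfer.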
