Which *BERT? A Survey Organizing Contextualized Encoders

10/02/2020
by Patrick Xia, et al.

Pretrained contextualized text encoders are now a staple of the NLP community. We present a survey on language representation learning with the aim of consolidating a series of shared lessons learned across a variety of recent efforts. While significant advancements continue at a rapid pace, we find that enough has now been discovered, in different directions, that we can begin to organize advances according to common themes. Through this organization, we highlight important considerations when interpreting recent contributions and choosing which model to use.
