Technical report on Conversational Question Answering

09/24/2019
by Ying Ju, et al.

Conversational Question Answering is a challenging task because it requires understanding the conversational history. In this project, we propose a new system, RoBERTa + AT + KD, which combines a rationale-tagging multi-task objective, adversarial training, knowledge distillation, and a linguistic post-processing strategy. Without data augmentation, our single model achieves 90.4 F1 on the CoQA test set, outperforming the previous state-of-the-art single model by 2.6 F1.
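As a rough illustration of how two of these components could fit together in one training step, the sketch below combines an FGM-style adversarial perturbation of the embedding layer with a distillation loss against a fixed teacher. This is a minimal sketch under assumptions, not the authors' released code: the `student`, `teacher`, `task_loss` helper, and batch interface are hypothetical placeholders.

```python
# Illustrative sketch only: simplified training step with FGM-style adversarial
# training and knowledge distillation. Model and data interfaces are assumed.
import torch
import torch.nn.functional as F

def fgm_perturb(embedding_layer, epsilon=1.0):
    """Add an FGM-style perturbation along the gradient of the embedding weights."""
    grad = embedding_layer.weight.grad
    norm = grad.norm()
    if norm != 0 and not torch.isnan(norm):
        delta = epsilon * grad / norm
        embedding_layer.weight.data.add_(delta)
        return delta
    return None

def train_step(student, teacher, embedding_layer, batch, optimizer,
               temperature=2.0, alpha=0.5):
    """One step: task loss + adversarial loss + distillation loss (all assumed interfaces)."""
    optimizer.zero_grad()

    # Standard forward/backward pass on the span-prediction (and rationale) losses.
    logits = student(**batch)            # assumed: model returns answer logits
    loss = task_loss(logits, batch)      # assumed helper: cross-entropy over spans
    loss.backward(retain_graph=True)

    # Adversarial pass: perturb embeddings, recompute the loss, then restore.
    delta = fgm_perturb(embedding_layer)
    if delta is not None:
        adv_loss = task_loss(student(**batch), batch)
        adv_loss.backward()
        embedding_layer.weight.data.sub_(delta)  # restore original embeddings

    # Knowledge distillation: match the frozen teacher's softened distribution.
    with torch.no_grad():
        teacher_logits = teacher(**batch)
    kd_loss = F.kl_div(F.log_softmax(logits / temperature, dim=-1),
                       F.softmax(teacher_logits / temperature, dim=-1),
                       reduction="batchmean") * (temperature ** 2)
    (alpha * kd_loss).backward()

    optimizer.step()
```

The perturb-then-restore pattern keeps the adversarial step cheap (one extra forward/backward) while leaving the stored embedding weights unchanged between batches; the temperature-scaled KL term is the standard soft-target distillation loss.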
