An Attentive Neural Architecture for Fine-grained Entity Type Classification

04/19/2016
by Sonse Shimaoka, et al.

In this work we propose a novel attention-based neural network model for the task of fine-grained entity type classification that, unlike previously proposed models, recursively composes representations of entity mention contexts. Our model achieves state-of-the-art performance with a 74.94% loose micro F1-score on the well-established FIGER dataset, a relative improvement of 2.59%. We also investigate the behavior of the attention mechanism of our model and observe that it can learn contextual linguistic expressions that indicate the fine-grained category memberships of an entity.
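To make the kind of architecture described above concrete, the sketch below shows a minimal PyTorch encoder that averages mention word embeddings and attends over bidirectional LSTM states of the surrounding context before a multi-label type classifier. The class name, layer sizes, and composition details are illustrative assumptions for exposition, not the authors' released code.

```python
# A hedged sketch of an attentive mention/context encoder for fine-grained
# entity typing. Hyper-parameters and names are assumptions, not the paper's
# exact configuration.
import torch
import torch.nn as nn

class AttentiveEntityTyper(nn.Module):
    def __init__(self, vocab_size, num_types, embed_dim=300,
                 hidden_dim=100, attn_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM encodes the tokens surrounding the mention.
        self.context_lstm = nn.LSTM(embed_dim, hidden_dim,
                                    batch_first=True, bidirectional=True)
        # Small feed-forward scorer: one attention logit per context position.
        self.attn = nn.Sequential(
            nn.Linear(2 * hidden_dim, attn_dim),
            nn.Tanh(),
            nn.Linear(attn_dim, 1),
        )
        # Multi-label type scores over [mention ; attended context].
        self.classifier = nn.Linear(embed_dim + 2 * hidden_dim, num_types)

    def forward(self, mention_ids, context_ids):
        # Mention representation: average of its word embeddings.
        mention = self.embed(mention_ids).mean(dim=1)          # (B, E)
        # Context representation: attention-weighted sum of LSTM states.
        states, _ = self.context_lstm(self.embed(context_ids))  # (B, T, 2H)
        weights = torch.softmax(self.attn(states), dim=1)        # (B, T, 1)
        context = (weights * states).sum(dim=1)                  # (B, 2H)
        # Train with a multi-label loss such as BCEWithLogitsLoss.
        return self.classifier(torch.cat([mention, context], dim=-1))
```

The attention weights give one scalar per context token, so inspecting them after training is one way to see which contextual expressions the model treats as evidence for a fine-grained type, in the spirit of the analysis mentioned in the abstract.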

