Joint Binary Neural Network for Multi-label Learning with Applications to Emotion Classification

02/03/2018
by   Huihui He, et al.

Recently, deep learning techniques have achieved success in multi-label classification due to their automatic representation learning ability and end-to-end learning framework. Existing deep neural networks for multi-label classification can be divided into two kinds: binary relevance neural networks (BRNN) and threshold-dependent neural networks (TDNN). However, the former requires training a set of isolated binary networks, which ignores dependencies between labels and incurs a heavy computational load, while the latter needs an additional threshold function to transform multi-class probabilities into multi-label outputs. In this paper, we propose a joint binary neural network (JBNN) to address these shortcomings. In JBNN, the representation of the text is fed to a set of logistic functions instead of a softmax function, and the multiple binary classifications are carried out synchronously in one neural network framework. Moreover, the relations between labels are captured by training on a joint binary cross entropy (JBCE) loss. To better suit multi-label emotion classification, we further propose to incorporate prior label relations into the JBCE loss. Experimental results on the benchmark dataset show that our model performs significantly better than state-of-the-art multi-label emotion classification methods, in both classification performance and computational efficiency.
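The output-layer idea described in the abstract can be sketched in a few lines of PyTorch. This is a minimal illustration under our own assumptions, not the authors' implementation: the class and function names (JointBinaryHead, joint_binary_cross_entropy) are hypothetical, any encoder producing a fixed-size text representation can be plugged in, and the prior-label-relation term is omitted because the abstract does not specify its form.

```python
# Minimal sketch of a JBNN-style output layer (illustrative, not the paper's code).
# A shared text representation is mapped to one logistic (sigmoid) unit per label,
# and all binary decisions are trained jointly with a binary cross-entropy loss.
import torch
import torch.nn as nn

class JointBinaryHead(nn.Module):
    def __init__(self, hidden_dim: int, num_labels: int):
        super().__init__()
        # One shared linear projection; each output unit scores one label.
        self.scorer = nn.Linear(hidden_dim, num_labels)

    def forward(self, text_repr: torch.Tensor) -> torch.Tensor:
        # Sigmoid per label instead of a softmax over labels, so the
        # multiple binary classifications are produced synchronously.
        return torch.sigmoid(self.scorer(text_repr))

def joint_binary_cross_entropy(probs: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    # Sum of per-label binary cross entropies, averaged over the batch.
    bce = nn.functional.binary_cross_entropy(probs, targets, reduction="none")
    return bce.sum(dim=1).mean()

if __name__ == "__main__":
    # Usage: encode a batch of texts with any encoder (e.g. an LSTM or CNN),
    # then feed the fixed-size representation to the joint binary head.
    batch, hidden_dim, num_labels = 4, 128, 8
    head = JointBinaryHead(hidden_dim, num_labels)
    text_repr = torch.randn(batch, hidden_dim)            # stand-in for encoder output
    targets = torch.randint(0, 2, (batch, num_labels)).float()
    loss = joint_binary_cross_entropy(head(text_repr), targets)
    loss.backward()
    print(loss.item())
```

Because every label gets its own sigmoid, no threshold function over a softmax distribution is needed, and because all labels share one network and one loss, the per-label classifiers are not trained in isolation as in binary relevance.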
