Global Hierarchical Neural Networks using Hierarchical Softmax

08/02/2023
by Jetze Schuurmans, et al.

This paper presents a framework in which hierarchical softmax is used to create a global hierarchical classifier. The approach is applicable to any classification task where there is a natural hierarchy among the classes. We report empirical results on four text classification datasets. On all four datasets, hierarchical softmax outperformed the regular softmax of a flat classifier in terms of macro-F1 and macro-recall. On three of the four datasets, hierarchical softmax also achieved higher micro-accuracy and macro-precision.
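To make the idea concrete, below is a minimal sketch of a two-level hierarchical softmax head in PyTorch, where the probability of a leaf class factorises as P(class) = P(parent group) * P(class | parent group). The class and argument names (HierarchicalSoftmax, parent_of) are illustrative assumptions, not taken from the paper, and the sketch assumes each leaf class has exactly one parent group.

```python
# Sketch only: a two-level hierarchical softmax classification head.
# Assumes a fixed class hierarchy given as parent_of[c] = parent group of leaf c.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HierarchicalSoftmax(nn.Module):
    def __init__(self, hidden_dim, parent_of):
        """
        hidden_dim : size of the encoder's output vector
        parent_of  : list where parent_of[c] is the parent-group index of leaf class c
        """
        super().__init__()
        parent_idx = torch.tensor(parent_of, dtype=torch.long)
        self.register_buffer("parent_of", parent_idx)
        num_parents = int(parent_idx.max()) + 1
        num_leaves = len(parent_of)

        # One linear layer scores the parent groups, another scores all leaves;
        # leaf scores are normalised only within their own parent group.
        self.parent_head = nn.Linear(hidden_dim, num_parents)
        self.leaf_head = nn.Linear(hidden_dim, num_leaves)

        # mask[g, c] = 0 if leaf c belongs to group g, -inf otherwise.
        mask = torch.full((num_parents, num_leaves), float("-inf"))
        mask[parent_idx, torch.arange(num_leaves)] = 0.0
        self.register_buffer("mask", mask)

    def log_probs(self, h):
        """Return log P(leaf | h) = log P(parent | h) + log P(leaf | parent, h)."""
        log_p_parent = F.log_softmax(self.parent_head(h), dim=-1)      # (B, P)
        leaf_scores = self.leaf_head(h).unsqueeze(1) + self.mask       # (B, P, L)
        log_p_leaf_given_parent = F.log_softmax(leaf_scores, dim=-1)   # (B, P, L)
        # Pick, for each leaf, the conditional under its own parent group.
        leaves = torch.arange(self.parent_of.numel(), device=h.device)
        cond = log_p_leaf_given_parent[:, self.parent_of, leaves]      # (B, L)
        return log_p_parent[:, self.parent_of] + cond                  # (B, L)

    def forward(self, h, targets):
        # Negative log-likelihood over leaf classes; training this single loss
        # updates the parent and leaf heads (and the shared encoder) jointly.
        return F.nll_loss(self.log_probs(h), targets)
```

Because the parent and leaf decisions share one encoder and are trained with a single joint loss, the head acts as a global hierarchical classifier rather than a collection of independently trained local classifiers per node.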
