Edge Contraction Pooling for Graph Neural Networks

05/27/2019
by Frederik Diehl, et al.

Graph Neural Network (GNN) research has concentrated on improving convolutional layers, with little attention paid to developing graph pooling layers. Yet pooling layers can enable GNNs to reason over abstracted groups of nodes instead of single nodes. To close this gap, we propose a graph pooling layer relying on the notion of edge contraction: EdgePool learns a localized and sparse hard pooling transform. We show that EdgePool outperforms alternative pooling methods, can be easily integrated into most GNN models, and improves performance on both node and graph classification.
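
To make the idea concrete, below is a minimal sketch of edge-contraction pooling, not the authors' reference code. It assumes node features `x` (a NumPy array of shape [num_nodes, dim]), an edge list `edges` of (u, v) pairs, and per-edge scores `edge_scores`; in EdgePool those scores are learned from the features of each edge's endpoints, which is omitted here for brevity.

```python
import numpy as np

def edge_contraction_pool(x, edges, edge_scores):
    """Greedily contract high-scoring edges; each node is merged at most once."""
    num_nodes = x.shape[0]
    merged = np.zeros(num_nodes, dtype=bool)
    cluster = -np.ones(num_nodes, dtype=int)  # cluster id assigned to each node
    next_cluster = 0

    # Visit edges from highest to lowest score and contract them if possible.
    for idx in np.argsort(-edge_scores):
        u, v = edges[idx]
        if merged[u] or merged[v]:
            continue  # an endpoint was already contracted into another edge
        cluster[u] = cluster[v] = next_cluster
        merged[u] = merged[v] = True
        next_cluster += 1

    # Nodes untouched by any contracted edge each keep their own cluster.
    for n in range(num_nodes):
        if cluster[n] < 0:
            cluster[n] = next_cluster
            next_cluster += 1

    # Pool node features by summing within each cluster. (The paper gates the
    # merged features with the learned edge score; a plain sum is used here.)
    pooled = np.zeros((next_cluster, x.shape[1]))
    np.add.at(pooled, cluster, x)
    return pooled, cluster
```

Because every contraction merges exactly two adjacent nodes, the resulting pooling is localized (each cluster spans a single edge) and sparse, and the cluster assignment can be reused to coarsen the edge set. A learned implementation based on this paper is available in PyTorch Geometric as the EdgePooling layer.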
