GRESNET: Graph Residuals for Reviving Deep Graph Neural Nets from Suspended Animation
In this paper, we investigate the causes of the GNNs' "suspended animation problem" and analyze whether the problem also exists in other GNN models. GNNs differ substantially from traditional deep learning models, and existing remedies for such problems, e.g., the residual terms used in ResNet for CNNs, do not work well for GNNs. In this paper, we study several novel graph residual terms designed specifically for GNNs. Equipped with the new graph residual blocks, we further introduce a new graph neural network architecture, namely the graph residual neural network (GRESNET), to resolve the observed problem. Instead of merely stacking spectral graph convolution layers on top of each other, GRESNET creates a highway that allows the raw node features to be fed into the graph convolution operators at every layer of the model. We study the effectiveness of the GRESNET architecture and of the different graph residuals for several existing vanilla GNNs. In addition, we provide theoretical analyses of GRESNET to demonstrate its effectiveness from the norm-preservation perspective.
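To make the described architecture concrete, the following is a minimal sketch, assuming a PyTorch-style GCN backbone; the class name GraphResidualGCN, the per-layer projections raw_proj, and the particular way the raw-feature residual is added are illustrative assumptions, not the paper's exact formulation (the paper studies several different graph residual terms).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphResidualGCN(nn.Module):
    """Sketch of a deep GCN in which every layer also receives a residual of
    the raw node features, rather than only the previous layer's output."""

    def __init__(self, in_dim, hid_dim, out_dim, num_layers=5):
        super().__init__()
        dims = [in_dim] + [hid_dim] * (num_layers - 1) + [out_dim]
        # Standard per-layer transformations of the spectral graph convolution.
        self.weights = nn.ModuleList(
            [nn.Linear(dims[i], dims[i + 1]) for i in range(num_layers)]
        )
        # Hypothetical projections that map the raw features into each layer's
        # input space so they can be added as a graph residual term.
        self.raw_proj = nn.ModuleList(
            [nn.Linear(in_dim, dims[i]) for i in range(num_layers)]
        )

    def forward(self, x, adj_norm):
        # x: [N, in_dim] raw node features; adj_norm: [N, N] normalized adjacency.
        raw = x
        h = x
        for k, (lin, proj) in enumerate(zip(self.weights, self.raw_proj)):
            # "Highway" residual: feed (a projection of) the raw features into
            # this layer's convolution input, instead of only stacking layers.
            h = h + proj(raw)
            h = adj_norm @ lin(h)
            if k < len(self.weights) - 1:
                h = F.relu(h)
        return h


# Usage sketch with hypothetical Cora-like shapes.
# model = GraphResidualGCN(in_dim=1433, hid_dim=64, out_dim=7, num_layers=5)
# logits = model(features, adj_norm)
```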