Superiority of GNN over NN in generalizing bandlimited functions

Martina Neuman
Michigan State

Graph Neural Networks (GNNs), with their ability to integrate graph information, have been widely used for data analysis. However, the expressive power of GNNs has only been studied for graph-level tasks, not for node-level tasks such as node classification, where one tries to interpolate missing nodal labels from the observed ones. In this paper, we study the expressive power of GNNs for the said classification task, which is in essence a function interpolation problem. Explicitly, we derive the number of weights and layers needed for a GNN to interpolate a bandlimited function in R^d. Our result shows that the number of weights needed to epsilon-approximate a bandlimited function using the GNN architecture is much smaller than the best known bound for a fully connected neural network (NN): in particular, one only needs O((log(epsilon^{-1}))^d) weights, using a GNN trained on O((log(epsilon^{-1}))^d) samples, to epsilon-approximate a discretized bandlimited signal in R^d. The result is obtained by drawing a connection between the GNN structure and the classical sampling theorems, making our work the first attempt in this direction.