YOUR NEIGHBORS ARE COMMUNICATING: TOWARDS POWERFUL AND SCALABLE GRAPH NEURAL NETWORKS

Abstract

Message passing graph neural networks (GNNs) are known to have their expressiveness upper-bounded by the 1-dimensional Weisfeiler-Lehman (1-WL) algorithm. To achieve more powerful GNNs, existing attempts either require ad hoc features or involve operations that incur high time and space complexities. In this work, we propose a general and provably powerful GNN framework that preserves the scalability of the message passing scheme. In particular, we first propose to empower 1-WL for graph isomorphism testing by considering edges among neighbors, giving rise to NC-1-WL. The expressiveness of NC-1-WL is shown theoretically to be strictly above 1-WL and below 3-WL. Further, we propose the NC-GNN framework as a differentiable neural version of NC-1-WL. Our simple implementation of NC-GNN is provably as powerful as NC-1-WL. Experiments demonstrate that NC-GNN achieves remarkable performance on various benchmarks.

1. INTRODUCTION

Graph Neural Networks (GNNs) (Gori et al., 2005; Scarselli et al., 2008) have been demonstrated to be effective for various graph tasks. In general, modern GNNs employ a message passing mechanism in which the representation of each node is recursively updated by aggregating representations from its neighbors (Atwood & Towsley, 2016; Li et al., 2016; Kipf & Welling, 2017; Hamilton et al., 2017; Veličković et al., 2018; Xu et al., 2019; Gilmer et al., 2017). Such message passing GNNs, however, have been shown to be at most as powerful as the 1-dimensional Weisfeiler-Lehman (1-WL) algorithm (Weisfeiler & Lehman, 1968) in distinguishing non-isomorphic graphs (Xu et al., 2019; Morris et al., 2019). Thus, message passing GNNs cannot distinguish some simple graphs and cannot detect certain important structural concepts (Chen et al., 2020; Arvind et al., 2020).

Recently, many efforts have been made to improve the expressiveness of message passing GNNs by considering high-dimensional WL algorithms (e.g., Morris et al. (2019; 2022); Maron et al. (2019)), exploiting subgraph information (e.g., Bodnar et al. (2021a); Zhang & Li (2021)), or adding more distinguishable features (e.g., Murphy et al. (2019); Bouritsas et al. (2022)). As thoroughly discussed in Section 5, these existing methods either rely on handcrafted, predefined, or domain-specific features, or require high computational cost and memory budget. In contrast, our goal in this work is to develop a general and provably powerful GNN framework, while maintaining the scalability of the message passing scheme.

Specifically, we first propose an extension of the 1-WL algorithm, namely NC-1-WL, that considers the edges among neighbors. In other words, we incorporate into the graph isomorphism test the information of which two neighbors are communicating (i.e., connected). To achieve this, we mathematically model the edges among neighbors as a multiset of multisets, in which each edge is represented as a multiset of two elements. We theoretically show that the expressiveness of our NC-1-WL in distinguishing non-isomorphic graphs is strictly above 1-WL and below 3-WL.

Further, based on NC-1-WL, we propose a general GNN framework, known as NC-GNN, which can be considered as a differentiable neural version of NC-1-WL. We provide a simple implementation of NC-GNN that is provably as powerful as NC-1-WL. Compared to existing expressive GNNs, our NC-GNN is a general, provably powerful, and, more importantly, scalable framework.
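To make the idea concrete, the color-refinement view of NC-1-WL can be sketched as follows. This is a minimal illustrative sketch in plain Python, not the paper's implementation; the adjacency-dict input and the tuple-based color encoding are assumptions of this sketch.

```python
from collections import Counter

def nc_1_wl(adj, num_iters=2):
    """Minimal sketch of NC-1-WL color refinement.

    `adj` maps each node (an integer id) to the set of its neighbors.
    Each round, a node's refined color combines (i) its current color,
    (ii) the multiset of its neighbors' colors (as in plain 1-WL), and
    (iii) the multiset of edges among its neighbors, where each edge is
    encoded as an unordered pair of its endpoints' colors.
    """
    colors = {v: 0 for v in adj}  # uniform initial coloring
    for _ in range(num_iters):
        colors = {
            v: (
                colors[v],
                tuple(sorted(colors[u] for u in nbrs)),
                tuple(sorted(
                    tuple(sorted((colors[u], colors[w])))
                    for u in nbrs for w in nbrs
                    if u < w and w in adj[u]  # u and w are connected neighbors
                )),
            )
            for v, nbrs in adj.items()
        }
    return Counter(colors.values())  # differing histograms => non-isomorphic
```

For instance, plain 1-WL cannot distinguish a 6-cycle from two disjoint triangles, but the sketch above separates them after one round: every triangle node sees one connected pair among its neighbors, while no hexagon node does.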


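As an illustration of how the NC-1-WL idea could translate into a message passing layer, the sketch below shows an NC-GNN-style update on scalar node features. The placeholder functions `phi` and `psi` stand in for the learnable MLPs of an actual implementation; they, the `(1 + eps)` self-weighting, and the scalar features are our own assumptions for this sketch, not the paper's architecture.

```python
def nc_gnn_layer(adj, h, eps=0.0):
    """Sketch of one NC-GNN-style layer on scalar node features.

    Each node aggregates (i) its own feature, (ii) its neighbors'
    features (standard message passing), and (iii) a symmetric message
    for every edge between two of its neighbors -- the extra term that
    mirrors the neighbor-edge multiset of NC-1-WL.
    """
    phi = lambda x: x * x + 1.0   # placeholder for the learnable update MLP
    psi = lambda a, b: a * b      # placeholder symmetric pair message
    out = {}
    for v, nbrs in adj.items():
        nbr_sum = sum(h[u] for u in nbrs)
        pair_sum = sum(
            psi(h[u], h[w])
            for u in nbrs for w in nbrs
            if u < w and w in adj[u]  # edge between two neighbors of v
        )
        out[v] = phi((1 + eps) * h[v] + nbr_sum + pair_sum)
    return out
```

With identical uniform input features, dropping `pair_sum` reduces this to a plain sum-aggregation update, which maps every node of a 6-cycle and of two disjoint triangles to the same value; the neighbor-pair term is what lets a single layer separate the two graphs.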