EQUIVARIANT HYPERGRAPH DIFFUSION NEURAL OPERATORS

Abstract

Hypergraph neural networks (HNNs), which use neural networks to encode hypergraphs, provide a promising way to model higher-order relations in data and to solve prediction tasks built on such relations. However, higher-order relations in practice contain complex patterns and are often highly irregular, so it is challenging to design an HNN that is expressive enough to capture these relations while remaining computationally efficient. Inspired by hypergraph diffusion algorithms, this work proposes a new HNN architecture, named ED-HNN, that provably approximates any continuous equivariant hypergraph diffusion operator, a class of operators that can model a wide range of higher-order relations. ED-HNN can be implemented efficiently by combining star expansions of hypergraphs with standard message passing neural networks. ED-HNN further shows great superiority in processing heterophilic hypergraphs and in constructing deep models. We evaluate ED-HNN on node classification over nine real-world hypergraph datasets. ED-HNN uniformly outperforms the best baselines on all nine datasets and improves prediction accuracy by more than 2% on four of them.

1. INTRODUCTION

Machine learning on graphs has recently attracted great attention in the community due to the ubiquity of graph-structured data and the associated inference and prediction problems (Zhu, 2005; Hamilton, 2020; Nickel et al., 2015). Current works primarily focus on graphs, which can model only pairwise relations in data. Emerging research has shown that higher-order relations involving more than two entities often reveal more significant information in many applications (Benson et al., 2021; Schaub et al., 2021; Battiston et al., 2020; Lambiotte et al., 2019; Lee et al., 2021). For example, higher-order network motifs form the fundamental building blocks of many real-world networks (Mangan & Alon, 2003; Benson et al., 2016; Tsourakakis et al., 2017; Li et al., 2017; Li & Milenkovic, 2017), and session-based (multi-step) behaviors often indicate the preferences of web users more precisely (Xia et al., 2021; Wang et al., 2020; 2021; 2022). To capture these higher-order relations, hypergraphs provide a dedicated mathematical abstraction (Berge, 1984). However, learning algorithms on hypergraphs remain far less developed than those on graphs. Recently, inspired by the success of graph neural networks (GNNs), researchers have started investigating hypergraph neural network models (HNNs) (Feng et al., 2019; Yadati et al., 2019; Dong et al., 2020; Huang & Yang, 2021; Bai et al., 2021; Arya et al., 2020). Compared with GNNs, designing HNNs is more challenging. First, as mentioned above, the higher-order relations modeled by hyperedges can contain complex information. Second, hyperedges in real-world hypergraphs are often large and irregular in size. Therefore, the key challenge in designing HNNs is how to effectively represent higher-order relations while efficiently processing these irregular hyperedges.
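To make the star-expansion idea mentioned above concrete, the sketch below converts a hypergraph into a bipartite graph by introducing one auxiliary node per hyperedge and connecting it to the hyperedge's member vertices; standard message passing can then run on the resulting graph. The function name `star_expand` and the plain edge-list representation are illustrative assumptions, not the paper's actual API.

```python
def star_expand(num_nodes, hyperedges):
    """Return bipartite edges (vertex, hyperedge-node) for a hypergraph.

    num_nodes: number of vertices in the hypergraph.
    hyperedges: list of tuples of vertex indices, one tuple per hyperedge.
    Hyperedge i is mapped to the auxiliary node with index num_nodes + i.
    """
    bipartite_edges = []
    for i, members in enumerate(hyperedges):
        e_node = num_nodes + i  # auxiliary node representing hyperedge i
        for v in members:
            bipartite_edges.append((v, e_node))
    return bipartite_edges

# A toy hypergraph with 4 vertices and two hyperedges {0,1,2} and {2,3}:
edges = star_expand(4, [(0, 1, 2), (2, 3)])
# edges == [(0, 4), (1, 4), (2, 4), (2, 5), (3, 5)]
```

Note that the expansion handles hyperedges of arbitrary and irregular sizes uniformly, which is what lets standard (pairwise) message passing frameworks process hypergraphs without per-size special cases.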
In this work, inspired by recently developed hypergraph diffusion algorithms (Li et al., 2020a; Liu et al., 2021b; Fountoulakis et al., 2021; Takai et al., 2020; Tudisco et al., 2021a), we design a novel HNN architecture with provable expressiveness to approximate a large class of hypergraph diffusion processes while remaining computationally efficient. Hypergraph diffusion is significant due to


