Authors
Charles Dickens, Edward Huang, Aishwarya Reganti, Jiong Zhu, Karthik Subbian, Danai Koutra
Publication date
2024/5/13
Book
Companion Proceedings of the ACM on Web Conference 2024
Pages
1502-1510
Description
Graph summarization as a preprocessing step is an effective and complementary technique for scalable graph neural network (GNN) training. In this work, we propose the Coarsening Via Convolution Matching (ConvMatch) algorithm and a highly scalable variant, A-ConvMatch, for creating summarized graphs that preserve the output of graph convolution. We evaluate ConvMatch on six real-world link prediction and node classification graph datasets, and show it is efficient and preserves prediction performance while significantly reducing the graph size. Notably, ConvMatch achieves up to 95% of the prediction performance of GNNs on node classification while trained on graphs summarized down to 1% of the size of the original graph. Furthermore, on link prediction tasks, ConvMatch consistently outperforms all baselines, achieving up to a 2x improvement.
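For intuition about the idea described above, the following is a minimal NumPy sketch of graph coarsening that greedily merges connected node pairs while trying to keep the propagated (graph-convolution) features close to the originals. The merge rule (summed edges, averaged features), the cost function, and the greedy loop are illustrative assumptions for this sketch only; they are not the paper's ConvMatch or A-ConvMatch algorithm.

```python
# Hypothetical illustration of coarsening that tries to preserve graph
# convolution outputs. NOT the authors' ConvMatch implementation.
import numpy as np

def gcn_propagate(A, X):
    """One symmetric-normalized convolution step: D^-1/2 (A + I) D^-1/2 X."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt @ X

def merge_pair(A, X, i, j):
    """Merge nodes i and j into one supernode (sum adjacency, average features)."""
    keep = [k for k in range(A.shape[0]) if k != j]
    A_new = A.copy()
    A_new[i, :] += A[j, :]
    A_new[:, i] += A[:, j]
    A_new[i, i] = 0.0                      # drop the self-loop created by the merge
    A_new = A_new[np.ix_(keep, keep)]
    X_new = X.copy()
    X_new[i] = (X[i] + X[j]) / 2.0
    return A_new, X_new[keep]

def coarsen(A, X, target_nodes):
    """Greedily merge the connected pair whose merge least perturbs the
    propagated features (a stand-in cost for 'convolution matching')."""
    while A.shape[0] > target_nodes:
        H = gcn_propagate(A, X)
        best, best_cost = None, np.inf
        for i, j in zip(*np.triu_indices(A.shape[0], k=1)):
            if A[i, j] == 0:               # only consider connected pairs
                continue
            A_m, X_m = merge_pair(A, X, i, j)
            ref = np.delete(H, j, axis=0)  # original outputs, with i and j pooled
            ref[i] = (H[i] + H[j]) / 2.0
            cost = np.linalg.norm(gcn_propagate(A_m, X_m) - ref)
            if cost < best_cost:
                best, best_cost = (i, j), cost
        A, X = merge_pair(A, X, *best)
    return A, X

# Tiny example: a 6-node path graph with random features, coarsened to 3 supernodes.
rng = np.random.default_rng(0)
A = np.zeros((6, 6))
for k in range(5):
    A[k, k + 1] = A[k + 1, k] = 1.0
X = rng.normal(size=(6, 4))
A_small, X_small = coarsen(A, X, target_nodes=3)
print(A_small.shape, X_small.shape)        # (3, 3) (3, 4)
```

The sketch scores every candidate merge by recomputing the convolution on the coarsened graph, which is quadratic and far from the scalable procedure the paper proposes; it only illustrates what "preserving the output of graph convolution" means as an objective.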