Authors
Yan Shi, Jun-Xiong Cai, Yoli Shavit, Tai-Jiang Mu, Wensen Feng, Kai Zhang
Publication date
2022
Conference
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Pages
12517-12526
Description
Graph Neural Networks (GNNs) with attention have been successfully applied to learning visual feature matching. However, current methods learn on complete graphs, resulting in quadratic complexity in the number of features. Motivated by a prior observation that self- and cross-attention matrices converge to a sparse representation, we propose ClusterGNN, an attentional GNN architecture which operates on clusters for learning the feature matching task. Using a progressive clustering module, we adaptively divide keypoints into different subgraphs to reduce redundant connectivity, and employ a coarse-to-fine paradigm to mitigate misclassification within images. Our approach yields a 59.7% reduction in runtime and a 58.4% reduction in memory consumption for dense detection, compared to current state-of-the-art GNN-based matching, while achieving competitive performance on various computer vision tasks.
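The core idea described above, restricting attention to within-cluster subsets of keypoints so the pairwise cost drops below the quadratic cost of a complete graph, can be sketched as follows. This is an illustrative approximation, not the authors' implementation: the cluster assignment here is a simple random-projection split standing in for the paper's learned progressive clustering module, and the attention is plain dot-product softmax rather than the full attentional GNN layer.

```python
import numpy as np

def full_attention(x):
    # Dense self-attention over all N keypoint features: O(N^2) pairs.
    scores = x @ x.T                                          # (N, N) similarities
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)             # row-wise softmax
    return weights @ x

def clustered_attention(x, labels):
    # Attention restricted to each cluster: the cost is the sum of |c|^2
    # over clusters, far below N^2 when clusters are reasonably balanced.
    out = np.empty_like(x)
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        out[idx] = full_attention(x[idx])
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((200, 32))        # 200 keypoints, 32-dim descriptors
# Hypothetical stand-in for the learned clustering: sign of a random projection.
labels = (x @ rng.standard_normal(32) > 0).astype(int)

y = clustered_attention(x, labels)
pairs_full = x.shape[0] ** 2
pairs_clustered = sum(int(np.sum(labels == c)) ** 2 for c in np.unique(labels))
```

With two roughly balanced clusters, `pairs_clustered` is about half of `pairs_full`; with more clusters the saving grows, which is the mechanism behind the reported runtime and memory reductions.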
Scholar articles
Y Shi, JX Cai, Y Shavit, TJ Mu, W Feng, K Zhang - Proceedings of the IEEE/CVF conference on computer …, 2022