Authors
Avishek Joey Bose, Ankit Jain, Piero Molino, William L Hamilton
Publication date
2019/12/20
Journal
arXiv preprint arXiv:1912.09867
Description
We consider the task of few-shot link prediction on graphs. The goal is to learn from a distribution over graphs so that a model is able to quickly infer missing edges in a new graph after a small amount of training. We show that current link prediction methods are generally ill-equipped to handle this task: they cannot effectively transfer knowledge learned on one graph to another, and they struggle to learn from sparse samples of edges. To address this challenge, we introduce a new gradient-based meta-learning framework, Meta-Graph. Our framework leverages higher-order gradients along with a learned graph signature function that conditionally generates a graph neural network initialization. Using a novel set of few-shot link prediction benchmarks, we show that Meta-Graph can learn to quickly adapt to a new graph using only a small sample of true edges, enabling not only fast adaptation but also improved results at convergence.
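The following is a minimal sketch (not the authors' implementation) of the kind of training loop the abstract describes: a shared link-prediction model whose initialization is conditioned by a learned graph signature function, adapted to each graph with a few inner gradient steps on its sparse support edges, and meta-updated with higher-order gradients through that adaptation. All module names, the FiLM-style conditioning, and the data layout (support/query edge splits per graph) are illustrative assumptions.

```python
# Hedged sketch of meta-learning for few-shot link prediction.
# Module names and the conditioning mechanism are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LinkPredictor(nn.Module):
    """Toy link scorer: a linear node encoder plus dot-product edge scoring."""

    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.encoder = nn.Linear(in_dim, hid_dim)

    def forward(self, x, edges, modulation):
        # `modulation` is the graph-signature output; here it rescales the
        # shared representation (an assumed, FiLM-like form of conditioning).
        h = torch.tanh(self.encoder(x) * modulation)
        src, dst = edges                       # edges: LongTensor of shape (2, E)
        return (h[src] * h[dst]).sum(dim=-1)   # one logit per candidate edge


class SignatureNet(nn.Module):
    """Toy graph signature: mean-pools node features into a modulation vector."""

    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                 nn.Linear(hid_dim, hid_dim))

    def forward(self, x):
        return torch.sigmoid(self.mlp(x.mean(dim=0)))


def inner_adapt(model, params, x, edges, labels, modulation, lr=0.01, steps=2):
    """A few gradient steps on a graph's sparse support edges.

    create_graph=True keeps the adaptation differentiable, so the outer
    (meta) update sees higher-order gradients through these steps.
    """
    for _ in range(steps):
        logits = torch.func.functional_call(model, params, (x, edges, modulation))
        loss = F.binary_cross_entropy_with_logits(logits, labels)
        grads = torch.autograd.grad(loss, tuple(params.values()), create_graph=True)
        params = {k: p - lr * g for (k, p), g in zip(params.items(), grads)}
    return params


def meta_train_step(model, signature, meta_opt, graphs, inner_lr=0.01):
    """One meta-update over a batch of graphs sampled from the graph distribution."""
    meta_opt.zero_grad()
    init_params = dict(model.named_parameters())
    meta_loss = 0.0
    for g in graphs:  # each g: {"x", "support_edges", "support_y", "query_edges", "query_y"}
        mod = signature(g["x"])                               # graph signature
        adapted = inner_adapt(model, init_params, g["x"], g["support_edges"],
                              g["support_y"], mod, lr=inner_lr)
        query_logits = torch.func.functional_call(
            model, adapted, (g["x"], g["query_edges"], mod))
        meta_loss = meta_loss + F.binary_cross_entropy_with_logits(
            query_logits, g["query_y"])
    (meta_loss / len(graphs)).backward()   # backprop through the inner adaptation
    meta_opt.step()                        # updates the shared init and the signature
```

Under these assumptions, a single optimizer over both modules, for example torch.optim.Adam(list(model.parameters()) + list(signature.parameters()), lr=1e-3), meta-trains the shared initialization and the signature function jointly; a new graph would then be handled at test time by one call to inner_adapt on its few observed edges.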
Total citations
2020: 9, 2021: 14, 2022: 22, 2023: 25, 2024: 12