Authors
Aravind Sankar, Yanhong Wu, Liang Gou, Wei Zhang, Hao Yang
Publication date
2020/1/20
Book
Proceedings of the 13th International Conference on Web Search and Data Mining
Pages
519-527
Description
Learning node representations in graphs is important for many applications such as link prediction, node classification, and community detection. Existing graph representation learning methods primarily target static graphs while many real-world graphs evolve over time. Complex time-varying graph structures make it challenging to learn informative node representations over time.
We present Dynamic Self-Attention Network (DySAT), a novel neural architecture that learns node representations to capture dynamic graph structural evolution. Specifically, DySAT computes node representations through joint self-attention along the two dimensions of structural neighborhood and temporal dynamics. Compared with state-of-the-art recurrent methods modeling graph evolution, dynamic self-attention is efficient, while achieving consistently superior performance. We conduct link prediction experiments on two graph types …
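The core idea described above — self-attention applied along a structural dimension (a node's neighbors within one snapshot) and a temporal dimension (one node's embeddings across snapshots) — can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, weight shapes, and the identity weight matrices are illustrative assumptions, and only the temporal direction is shown, using a causal mask so each time step attends to itself and earlier steps.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv, mask=None):
    # Scaled dot-product self-attention over the rows of X.
    # X: (seq_len, d_in); returns (seq_len, d_out).
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    if mask is not None:
        # Disallowed positions get a large negative score -> ~0 weight.
        scores = np.where(mask, scores, -1e9)
    return softmax(scores, axis=-1) @ V

# Temporal self-attention for one node (hypothetical toy dimensions):
# X_t holds the node's embedding at each of T graph snapshots.
rng = np.random.default_rng(0)
T, d = 4, 8
X_t = rng.standard_normal((T, d))
Wq = Wk = Wv = np.eye(d)                       # identity weights, for illustration
causal = np.tril(np.ones((T, T), dtype=bool))  # step t attends to steps <= t
Z = self_attention(X_t, Wq, Wk, Wv, mask=causal)
```

Structural attention would reuse the same `self_attention` routine per snapshot, with the mask encoding graph adjacency rather than temporal order. Because all positions are attended in parallel, this avoids the sequential bottleneck of the recurrent methods the abstract compares against.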
Total citations
Per-year citation chart (2020–2024)