Authors
Xiao Liu, Shiyu Zhao, Kai Su, Yukuo Cen, Jiezhong Qiu, Mengdi Zhang, Wei Wu, Yuxiao Dong, Jie Tang
Publication date
2022/8/14
Book
Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
Pages
1120-1130
Description
Knowledge graph (KG) embeddings have been a mainstream approach for reasoning over incomplete KGs. However, limited by their inherently shallow and static architectures, they can hardly deal with the rising focus on complex logical queries, which comprise logical operators, imputed edges, multiple source entities, and unknown intermediate entities. In this work, we present the Knowledge Graph Transformer (kgTransformer) with masked pre-training and fine-tuning strategies. We design a KG triple transformation method to enable Transformer to handle KGs, which is further strengthened by Mixture-of-Experts (MoE) sparse activation. We then formulate complex logical queries as masked prediction and introduce a two-stage masked pre-training strategy to improve transferability and generalizability. Extensive experiments on two benchmarks demonstrate that kgTransformer can consistently outperform …
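The masked-prediction formulation mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's code: the entity/relation names, the query, and the flat tokenization scheme are hypothetical placeholders, showing only the idea of replacing unknown entities in a query graph with a mask token for a Transformer to fill in.

```python
# Illustrative sketch (not the authors' implementation): casting a complex
# logical query over a KG as masked prediction. Names and tokenization
# are placeholder assumptions.

MASK = "[MASK]"

def query_to_masked_sequence(triples, unknowns):
    """Flatten query triples (head, relation, tail) into a token sequence,
    replacing unknown (intermediate or target) entities with [MASK]."""
    tokens = []
    for head, rel, tail in triples:
        tokens.extend([
            MASK if head in unknowns else head,
            rel,
            MASK if tail in unknowns else tail,
        ])
    return tokens

# Hypothetical query: "Which person ?y works at a company ?x founded by Alice?"
# ?x is an unknown intermediate entity, ?y is the unknown answer entity.
triples = [("Alice", "founded", "?x"), ("?y", "works_at", "?x")]
seq = query_to_masked_sequence(triples, unknowns={"?x", "?y"})
# A Transformer would then be trained to predict entities at the [MASK] positions.
```

In this framing, both multi-hop intermediate entities and the final answer become mask positions, which is what lets one pre-training objective cover many query shapes.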
Total citations
2022–2024 (per-year citation chart; counts garbled in extraction)
Scholar articles
X Liu, S Zhao, K Su, Y Cen, J Qiu, M Zhang, W Wu… - Proceedings of the 28th ACM SIGKDD Conference on …, 2022