Authors
Dandan Li, Douglas Summers-Stay
Publication date
2020/6
Journal
Annals of Mathematics and Artificial Intelligence
Volume
88
Pages
533-547
Publisher
Springer International Publishing
Description
Word embedding models excel at measuring word similarity and completing analogies. Word embeddings built from different notions of context trade off strengths in one area for weaknesses in another: linear bag-of-words contexts, such as those in word2vec, better capture topical similarity, while dependency-based word embeddings better encode functional similarity. By combining these two kinds of word embeddings using different metrics, we show how the best aspects of both approaches can be captured, achieving state-of-the-art performance on standard word and relational similarity benchmarks.
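The combination the abstract describes can be illustrated with a minimal sketch: score a word pair in each embedding space separately, then blend the two similarity scores. The toy vectors, the dictionary names (`bow_vecs`, `dep_vecs`), and the mixing weight `alpha` below are all illustrative assumptions, not the paper's actual data or method.

```python
import math

# Hypothetical 2-d toy vectors; real bag-of-words (word2vec-style) and
# dependency-based embeddings would be learned from a corpus.
bow_vecs = {"coffee": [0.9, 0.1], "tea": [0.8, 0.2], "drink": [0.7, 0.6]}
dep_vecs = {"coffee": [0.2, 0.9], "tea": [0.3, 0.8], "drink": [0.9, 0.1]}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def combined_similarity(w1, w2, alpha=0.5):
    """Blend similarity from the two spaces; alpha weights the
    bag-of-words (topical) space against the dependency (functional) one."""
    return (alpha * cosine(bow_vecs[w1], bow_vecs[w2])
            + (1 - alpha) * cosine(dep_vecs[w1], dep_vecs[w2]))

print(combined_similarity("coffee", "tea"))
print(combined_similarity("coffee", "drink"))
```

A weighted average of cosine scores is only one possible metric combination; the point is that a pair close in both spaces ("coffee"/"tea") scores higher than one close in only one of them.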