Authors
Oren Melamud, Jacob Goldberger, Ido Dagan
Publication date
2016/8
Conference
Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning (CoNLL 2016)
Pages
51-61
Description
Context representations are central to various NLP tasks, such as word sense disambiguation, named entity recognition, coreference resolution, and many more. In this work we present a neural model for efficiently learning a generic context embedding function from large corpora, using bidirectional LSTM. With a very simple application of our context representations, we manage to surpass or nearly reach state-of-the-art results on sentence completion, lexical substitution and word sense disambiguation tasks, while substantially outperforming the popular context representation of averaged word embeddings. We release our code and pretrained models, suggesting they could be useful in a wide variety of NLP tasks.
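The abstract describes the core architecture: a bidirectional LSTM reads the sentence around a target position and produces a single context vector, which can then be compared against candidate word embeddings (e.g., for sentence completion or lexical substitution), with averaged word embeddings as the baseline. The sketch below is an illustrative approximation of that idea, not the authors' released context2vec code; the class name, dimensions, and the cosine-similarity scoring step are assumptions made for the example.

```python
# Minimal sketch (assumed, not the released context2vec implementation):
# a bidirectional-LSTM context encoder plus the averaged-word-embedding baseline.
import torch
import torch.nn as nn


class BiLSTMContextEncoder(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=300):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # One LSTM reads the left context left-to-right, the other reads the
        # right context right-to-left; the target word itself is excluded.
        self.fwd_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.bwd_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # MLP merges the two directional states into one context embedding.
        self.mlp = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, embed_dim),
        )

    def forward(self, token_ids, target_pos):
        """Embed the sentential context surrounding `target_pos`."""
        emb = self.embed(token_ids)                         # (batch, seq, embed_dim)
        left = emb[:, :target_pos, :]                       # tokens before the target
        right = emb[:, target_pos + 1:, :].flip(dims=[1])   # tokens after, reversed
        _, (h_left, _) = self.fwd_lstm(left)
        _, (h_right, _) = self.bwd_lstm(right)
        context = torch.cat([h_left[-1], h_right[-1]], dim=-1)
        return self.mlp(context)                            # (batch, embed_dim)

    def average_baseline(self, token_ids, target_pos):
        """Averaged-word-embedding baseline: mean of the context word vectors."""
        emb = self.embed(token_ids)
        mask = torch.ones(token_ids.size(1), dtype=torch.bool)
        mask[target_pos] = False
        return emb[:, mask, :].mean(dim=1)


# Toy usage: score candidate fillers for a masked position by cosine similarity
# between the context vector and the candidates' word embeddings.
encoder = BiLSTMContextEncoder(vocab_size=10000)
sentence = torch.randint(0, 10000, (1, 8))                  # one 8-token sentence
ctx = encoder(sentence, target_pos=3)
candidates = encoder.embed(torch.tensor([[11, 42, 7]]))     # three candidate word ids
scores = torch.cosine_similarity(ctx.unsqueeze(1), candidates, dim=-1)
```

In this setup, ranking candidates by the similarity scores is the "very simple application" of the context representation that the abstract refers to; the same context vector can serve as a feature for supervised word sense disambiguation.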
Total citations
2016: 5 · 2017: 36 · 2018: 72 · 2019: 110 · 2020: 126 · 2021: 128 · 2022: 92 · 2023: 69 · 2024: 39