Authors
Dan Roth, Heng Ji, Ming-Wei Chang, Taylor Cassidy
Publication date
2014
Description
Contextual disambiguation and grounding of concepts and entities in natural language text are essential to moving forward in many natural language understanding tasks and are fundamental to many applications. The Wikification task (Bunescu and Pasca, 2006; Mihalcea and Csomai, 2007; Ratinov et al., 2011) aims at automatically identifying concept mentions appearing in a text document and linking them to (or “grounding them in”) concept referents in a knowledge base (KB) (e.g., Wikipedia). For example, given the sentence “The Times report on Blumental (D) has the potential to fundamentally reshape the contest in the Nutmeg State.”, a Wikifier should identify the key entities and concepts (Times, Blumental, D, and the Nutmeg State) and disambiguate them by mapping them to an encyclopedic resource, revealing, for example, that “D” here represents the Democratic Party and that “the Nutmeg State” refers to Connecticut.
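The two steps described above — detecting concept mentions and grounding each one in a KB entry — can be sketched with a toy wikifier. The anchor dictionary, context profiles, and overlap-based scoring below are hypothetical stand-ins for the Wikipedia-derived statistics real systems use; they only illustrate the pipeline on the example sentence, assuming a simple bag-of-words disambiguation.

```python
import re

# Toy illustration of the Wikification pipeline: (1) detect concept
# mentions, (2) ground each in a KB. All data below is a hypothetical
# hand-built stand-in for Wikipedia-derived resources.

# Mention surface forms -> candidate KB titles (an "anchor dictionary").
ANCHORS = {
    "The Times": ["The New York Times", "The Times"],
    "Blumental": ["Richard Blumenthal"],
    "D": ["Democratic Party (United States)", "D (letter)"],
    "the Nutmeg State": ["Connecticut"],
}

# Context words associated with each KB title (a stand-in for the
# article text a real system would compare against).
CONTEXT = {
    "The New York Times": {"report", "newspaper", "contest"},
    "The Times": {"london", "newspaper"},
    "Richard Blumenthal": {"contest", "senator", "connecticut"},
    "Democratic Party (United States)": {"contest", "election", "state"},
    "D (letter)": {"alphabet", "letter"},
    "Connecticut": {"state", "contest", "nutmeg"},
}

def wikify(sentence: str) -> dict:
    """Link each detected mention to its best-matching KB title."""
    tokens = set(re.findall(r"[a-z]+", sentence.lower()))
    links = {}
    for mention, candidates in ANCHORS.items():
        if mention.lower() in sentence.lower():
            # Disambiguate by word overlap between the sentence and
            # each candidate's context profile.
            links[mention] = max(
                candidates,
                key=lambda c: len(CONTEXT[c] & tokens),
            )
    return links

sentence = ("The Times report on Blumental (D) has the potential to "
            "fundamentally reshape the contest in the Nutmeg State.")
print(wikify(sentence))
```

On this sentence the overlap scores pick the intended referents: “the Nutmeg State” grounds to Connecticut and “D” to the Democratic Party, matching the disambiguations described above. Real Wikifiers replace the overlap heuristic with learned local and global coherence models.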
Wikification may benefit both human end-users and Natural Language Processing (NLP) systems. When a document is Wikified, a reader can more easily comprehend it, as information about related topics and relevant enriched knowledge from a KB is readily accessible. From a system-to-system perspective, a Wikified document conveys the meanings of its key concepts and entities by grounding them in an encyclopedic resource or a structurally rich ontology. Indeed, there is evidence that Wikification output can improve a broad range of downstream NLP tasks, including coreference resolution (Ratinov and Roth, 2012), text classification (Gabrilovich and Markovitch, 2007; Chang et al., 2008; Vitale et al., 2012), and applications such as …