Authors
Seunghak Yu, Sathish Reddy Indurthi, Seohyun Back, Haejun Lee
Publication date
2018
Journal
Proceedings of the Workshop on Machine Reading for Question Answering (MRQA @ ACL 2018), p. 21.
Description
Reading Comprehension (RC) of text is one of the fundamental tasks in natural language processing. In recent years, several end-to-end neural network models have been proposed to solve RC tasks, but most of them struggle to reason over long documents. In this work, we propose a novel Memory Augmented Machine Comprehension Network (MAMCN) to address long-range dependencies in machine reading comprehension. We perform extensive experiments to evaluate the proposed method on well-known benchmark datasets such as SQuAD, QUASAR-T, and TriviaQA, and we achieve state-of-the-art performance on both the document-level (QUASAR-T, TriviaQA) and paragraph-level (SQuAD) datasets compared to all previously published approaches.
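The abstract only names the memory-augmentation idea, so the following is a minimal NumPy sketch of a generic key-value memory read, the kind of attention over an external memory that lets a model condition on distant parts of a long document. It is an illustration of the general mechanism, not the paper's MAMCN architecture; all names and dimensions are illustrative assumptions.

import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def memory_read(query, memory_keys, memory_values):
    """Attend over an external memory with a query vector.

    query:         (d,)   encoded question/reader state
    memory_keys:   (n, d) one key per stored document segment
    memory_values: (n, d) corresponding value vectors
    Returns a (d,) summary that can condition answer prediction.
    """
    scores = memory_keys @ query      # (n,) similarity of query to each slot
    weights = softmax(scores)         # attention distribution over memory slots
    return weights @ memory_values    # weighted read from memory

# Toy usage: 6 memory slots holding 4-dimensional segment representations.
rng = np.random.default_rng(0)
keys = rng.normal(size=(6, 4))
values = rng.normal(size=(6, 4))
q = rng.normal(size=4)
print(memory_read(q, keys, values))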
Total citations
Per-year citation counts for 2019–2024 (chart data not cleanly recoverable).
Scholar articles
S Yu, SR Indurthi, S Back, H Lee - Proceedings of the workshop on machine reading for …, 2018