Authors
Seohyun Back, Seunghak Yu, Sathish Reddy Indurthi, Jihie Kim, Jaegul Choo
Publication date
2018
Conference
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Pages
2131-2140
Description
Machine reading comprehension helps machines learn to utilize most of the human knowledge written in the form of text. Existing approaches have made significant progress, approaching human-level performance, but they are still limited to understanding at most a few paragraphs and fail to properly comprehend lengthy documents. In this paper, we propose a novel deep neural network architecture to handle long-range dependencies in reading comprehension (RC) tasks. In detail, our method has two novel aspects: (1) an advanced memory-augmented architecture and (2) an expanded gated recurrent unit with dense connections that mitigates potential information distortion occurring in the memory. Our proposed architecture is widely applicable to other models. We have performed extensive experiments on well-known benchmark datasets such as TriviaQA, QUASAR-T, and SQuAD. The experimental results demonstrate that the proposed method outperforms existing methods, especially for lengthy documents.
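The abstract's second component, a gated recurrent unit "expanded" with dense connections, can be illustrated with a minimal sketch. The snippet below is an assumption-laden interpretation, not the authors' implementation: it reads "dense connections" as DenseNet-style concatenation, where each GRU layer receives the original input plus every preceding layer's output, so early representations reach later layers without passing through (and being distorted by) every intermediate gate. The class name `DenselyConnectedGRU` and all hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn


class DenselyConnectedGRU(nn.Module):
    """Stack of bidirectional GRUs with DenseNet-style skip connections
    (a sketch of one reading of the paper's "expanded GRU"): each layer's
    input is the concatenation of the original input and all previous
    layers' outputs."""

    def __init__(self, input_size: int, hidden_size: int, num_layers: int = 3):
        super().__init__()
        self.layers = nn.ModuleList()
        in_size = input_size
        for _ in range(num_layers):
            self.layers.append(
                nn.GRU(in_size, hidden_size, batch_first=True, bidirectional=True)
            )
            # The next layer sees everything produced so far plus this output.
            in_size += 2 * hidden_size

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]  # running list of densely connected inputs
        for gru in self.layers:
            out, _ = gru(torch.cat(features, dim=-1))
            features.append(out)
        # (batch, seq_len, input_size + 2 * hidden_size * num_layers)
        return torch.cat(features, dim=-1)


if __name__ == "__main__":
    encoder = DenselyConnectedGRU(input_size=128, hidden_size=64, num_layers=3)
    tokens = torch.randn(2, 50, 128)   # (batch, seq_len, embedding_dim)
    print(encoder(tokens).shape)       # torch.Size([2, 50, 512])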
Scholar articles
S Back, S Yu, SR Indurthi, J Kim, J Choo - Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2018