Title
Comparative Study of CNN and RNN for Natural Language Processing
Authors
Wenpeng Yin, Katharina Kann, Mo Yu, Hinrich Schütze
Publication date
2017/2/7
Journal
arXiv preprint arXiv:1702.01923
Description
Deep neural networks (DNNs) have revolutionized the field of natural language processing (NLP). Convolutional neural networks (CNNs) and recurrent neural networks (RNNs), the two main types of DNN architectures, are widely explored for handling various NLP tasks: CNNs are supposed to be good at extracting position-invariant features, while RNNs are good at modeling units in sequence. The state of the art on many NLP tasks often switches back and forth between CNNs and RNNs. This work is the first systematic comparison of CNNs and RNNs on a wide range of representative NLP tasks, aiming to give basic guidance for DNN selection.
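The contrast the abstract draws can be made concrete with a toy sketch. The following NumPy example (my own illustration, not code from the paper; all sizes, names, and random values are assumptions) shows why max-pooled convolutional features are position-invariant, while a recurrent encoder's final state is order-sensitive:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative only): 5-token vocabulary, embedding dimension 4.
emb = rng.normal(size=(5, 4))

def window_scores(token_ids, filt, width=2):
    """Responses of a single convolutional filter over every n-gram window.
    A CNN would max-pool these scores, keeping the strongest n-gram response
    wherever it occurs in the sequence -- the position-invariance the paper
    attributes to CNNs."""
    x = emb[token_ids]                                      # (seq_len, 4)
    windows = [x[i:i + width].ravel() for i in range(len(x) - width + 1)]
    return np.array(windows) @ filt                         # one score per window

def rnn_encode(token_ids, W, U):
    """Minimal Elman-style recurrence: the final hidden state depends on the
    order in which tokens arrive, i.e., it models units in sequence."""
    h = np.zeros(4)
    for t in emb[token_ids]:
        h = np.tanh(W @ h + U @ t)
    return h

filt = rng.normal(size=8)                # filter over bigram windows (2 * 4 values)
W, U = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))

a = [1, 2, 0, 3]   # bigram (1, 2) at the start
b = [0, 3, 1, 2]   # same bigram at the end

# The (1, 2) window yields the same filter response in both sequences, so if
# that window dominates, the max-pooled CNN feature is identical regardless of
# where the bigram appears; the RNN's final states differ because the token
# order differs.
```

Here the shared bigram's score is `window_scores(a, filt)[0]` in `a` and `window_scores(b, filt)[2]` in `b`, and the two values coincide exactly, whereas `rnn_encode` produces different final states for the two orderings.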
Total citations by year
2017: 20
2018: 126
2019: 188
2020: 191
2021: 249
2022: 210
2023: 273
2024: 109
Scholar articles
W Yin, K Kann, M Yu, H Schütze - arXiv preprint arXiv:1702.01923, 2017