Authors
Gao Huang, Zhuang Liu, Geoff Pleiss, Laurens Van Der Maaten, Kilian Q Weinberger
Publication date
2019/5/23
Journal
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)
Volume
44
Issue
12
Pages
8704-8716
Publisher
IEEE
Description
Recent work has shown that convolutional networks can be substantially deeper, more accurate, and efficient to train if they contain shorter connections between layers close to the input and those close to the output. In this paper, we embrace this observation and introduce the Dense Convolutional Network (DenseNet), which connects each layer to every other layer in a feed-forward fashion. Whereas traditional convolutional networks with L layers have L connections (one between each layer and its subsequent layer), our network has L(L+1)/2 direct connections. For each layer, the feature-maps of all preceding layers are used as inputs, and its own feature-maps are used as inputs into all subsequent layers. DenseNets have several compelling advantages: they alleviate the vanishing-gradient problem, encourage feature reuse, and substantially improve parameter efficiency. We evaluate our proposed …
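The connectivity pattern described above can be sketched in a few lines of plain Python. This is a hypothetical illustration, not the authors' reference implementation: feature-maps are stand-in lists of floats, and a trivial `halve` function substitutes for the paper's BN-ReLU-Conv composite layer. It shows why an L-layer dense block has L(L+1)/2 direct connections, since layer i receives all i preceding feature-maps as input.

```python
# Hypothetical sketch of DenseNet-style dense connectivity.
# Feature-maps are plain lists of floats; real layers would be
# BN-ReLU-Conv operations on tensors.

def num_dense_connections(L):
    """A network with L layers has L*(L+1)/2 direct connections:
    layer i receives i inputs (the network input plus i-1 earlier layers)."""
    return L * (L + 1) // 2

def halve(xs):
    """Toy stand-in for a composite layer: halves every value it sees."""
    return [v / 2 for v in xs]

def dense_block(x, layers):
    """Run a dense block: every layer sees the concatenation of all
    earlier feature-maps, mirroring DenseNet's channel-wise concatenation.

    x      -- initial feature-map (a list of floats)
    layers -- callables mapping a concatenated feature list to a new one
    """
    features = [x]
    for layer in layers:
        concatenated = [v for fmap in features for v in fmap]  # channel concat
        features.append(layer(concatenated))
    return features

if __name__ == "__main__":
    feats = dense_block([1.0, 2.0], [halve, halve, halve])
    print(num_dense_connections(4))  # 10 direct connections for 4 layers
    print(len(feats))                # 4 feature-maps: input + 3 layer outputs
```

Note how each successive layer's input grows: with a 2-value input, the three layers here see 2, 4, and 8 values respectively, which is the concatenation growth that DenseNet controls with its "growth rate" hyperparameter.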
Total citations
[Per-year citation chart, 2018–2024]
Scholar articles
G Huang, Z Liu, G Pleiss, L Van Der Maaten… - IEEE transactions on pattern analysis and machine …, 2019