Authors
Wenlin Chen, James Wilson, Stephen Tyree, Kilian Q Weinberger, Yixin Chen
Publication date
2016/8/13
Book
Proceedings of the 22nd ACM SIGKDD international conference on knowledge discovery and data mining
Pages
1475-1484
Description
Convolutional neural networks (CNNs) are increasingly used in many areas of computer vision. They are particularly attractive because of their ability to "absorb" great quantities of labeled data through millions of parameters. However, as model sizes increase, so do the storage and memory requirements of the classifiers, hindering many applications such as image and speech recognition on mobile phones and other devices. In this paper, we present a novel network architecture, Frequency-Sensitive Hashed Nets (FreshNets), which exploits inherent redundancy in both convolutional layers and fully-connected layers of a deep learning model, leading to dramatic savings in memory and storage consumption. Based on the key observation that the weights of learned convolutional filters are typically smooth and low-frequency, we first convert filter weights to the frequency domain with a discrete cosine transform …
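The key observation in the abstract, that learned filters are smooth and therefore low-frequency, can be illustrated with a small sketch. This is not the paper's code; it is a minimal NumPy demonstration (the `dct2` helper and the synthetic 4x4 filter are assumptions) showing that a 2D discrete cosine transform of a smooth filter concentrates its energy in the low-frequency coefficients, which is what makes the frequency-domain representation compressible:

```python
import numpy as np

def dct2(x):
    """2D orthonormal DCT-II, applied separably via a 1D DCT matrix."""
    n = x.shape[0]
    k = np.arange(n)
    # C[k, j] = s(k) * cos(pi * (2j + 1) * k / (2n)), orthonormal scaling
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0] *= 1.0 / np.sqrt(2.0)
    C *= np.sqrt(2.0 / n)
    return C @ x @ C.T

# A synthetic "smooth" 4x4 filter (stand-in for a learned conv filter)
filt = np.outer(np.linspace(1.0, 2.0, 4), np.linspace(1.0, 2.0, 4))
F = dct2(filt)

# Energy in the low-frequency 2x2 block vs. the whole spectrum
low = np.abs(F[:2, :2]).sum()
total = np.abs(F).sum()
print(low / total)  # most of the energy sits in the low-frequency block
```

Under FreshNets' premise, the high-frequency coefficients carry little energy for smooth filters, so they can be stored coarsely (or shared via hashing) with minimal accuracy loss.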
Total citations
(per-year citation histogram, 2016–2024; counts not recoverable from the extracted text)
Scholar articles
W Chen, J Wilson, S Tyree, KQ Weinberger, Y Chen - Proceedings of the 22nd ACM SIGKDD international …, 2016