Authors
Raja Giryes, Guillermo Sapiro, Alex M Bronstein
Publication date
2016/3/23
Journal
IEEE Transactions on Signal Processing
Volume
64
Issue
13
Pages
3444-3457
Publisher
IEEE
Description
Three important properties of a classification machinery are: (i) the system preserves the core information of the input data; (ii) the training examples convey information about unseen data; and (iii) the system treats points from different classes differently. In this paper, we show that these fundamental properties are satisfied by the architecture of deep neural networks. We formally prove that such networks with random Gaussian weights perform a distance-preserving embedding of the data, with a special treatment of in-class and out-of-class data: similar points at the input of the network are likely to have similar outputs. The theoretical analysis of deep networks presented here exploits tools from the compressed sensing and dictionary learning literature, thereby making a formal connection between these important topics. The derived results allow drawing conclusions on the metric learning properties of …
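The distance-preservation claim in the abstract can be checked empirically: a single wide layer with i.i.d. Gaussian weights (variance 2/m, so that the ReLU halving of energy is compensated) approximately preserves input norms, and nearby inputs map to nearby outputs while distant inputs stay separated. The sketch below is an illustrative NumPy experiment, not code from the paper; the dimensions, scaling, and test points are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 64, 4096  # input dimension, layer width (wide layer -> tighter concentration)

# One layer with random Gaussian weights, N(0, 2/m), followed by ReLU.
# The 2/m variance compensates for ReLU discarding half the signal energy.
W = rng.normal(0.0, np.sqrt(2.0 / m), size=(m, d))
relu = lambda z: np.maximum(z, 0.0)

# A unit-norm input, a small perturbation of it, and an unrelated random point.
x = rng.normal(size=d); x /= np.linalg.norm(x)
y_near = x + 0.05 * rng.normal(size=d); y_near /= np.linalg.norm(y_near)
y_far = rng.normal(size=d); y_far /= np.linalg.norm(y_far)

fx, fn, ff = relu(W @ x), relu(W @ y_near), relu(W @ y_far)

# Norm is approximately preserved: ||f(x)|| is close to ||x|| = 1.
print("output norm:", np.linalg.norm(fx))
# Similar inputs remain close; dissimilar inputs remain well separated.
print("near/far output distances:",
      np.linalg.norm(fx - fn), np.linalg.norm(fx - ff))
```

Re-running with a narrower layer (small m) makes the concentration visibly worse, which matches the role the layer width plays in the paper's bounds.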
Total citations
[Citations-per-year chart, 2014–2024]
Scholar articles
R Giryes, G Sapiro, AM Bronstein - arXiv preprint arXiv:1412.5896, 2014