Authors
Peter Dayan, Geoffrey E Hinton, Radford M Neal, Richard S Zemel
Publication date
1995/9/1
Journal
Neural Computation
Volume
7
Issue
5
Pages
889-904
Publisher
MIT Press
Description
Discovering the structure inherent in a set of patterns is a fundamental aim of statistical inference or learning. One fruitful approach is to build a parameterized stochastic generative model, independent draws from which are likely to produce the patterns. For all but the simplest generative models, each pattern can be generated in exponentially many ways. It is thus intractable to adjust the parameters to maximize the probability of the observed patterns. We describe a way of finessing this combinatorial explosion by maximizing an easily computed lower bound on the probability of the observations. Our method can be viewed as a form of hierarchical self-supervised learning that may relate to the function of bottom-up and top-down cortical processing pathways.
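The "easily computed lower bound" mentioned in the abstract is the variational free-energy bound; the sketch below uses my own notation (d a data pattern, α one hidden "explanation" of it, p the top-down generative model with parameters θ, q the bottom-up recognition model with parameters φ), not symbols copied from the paper itself.

\[
\log p(d \mid \theta) \;=\; \log \sum_{\alpha} p(\alpha, d \mid \theta) \;\ge\; \sum_{\alpha} q(\alpha \mid d, \phi)\,\bigl[\log p(\alpha, d \mid \theta) - \log q(\alpha \mid d, \phi)\bigr]
\]

The inequality follows from Jensen's inequality, and the gap equals KL(q(α | d, φ) ‖ p(α | d, θ)); maximizing the right-hand side over θ and φ therefore pushes up the data likelihood without summing over the exponentially many ways each pattern can be generated.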
Total citations
Per-year citation histogram, 1995–2024 (chart omitted)
Scholar articles
P Dayan, GE Hinton, RM Neal, RS Zemel - Neural Computation, 1995