Authors
Jorg Bornschein, Samira Shabanian, Asja Fischer, Yoshua Bengio
Publication date
2016/6/11
Conference
International Conference on Machine Learning
Pages
2511-2519
Publisher
PMLR
Description
Efficient unsupervised training and inference in deep generative models remain a challenging problem. One basic approach, known as the Helmholtz machine or the Variational Autoencoder, involves training a top-down directed generative model together with a bottom-up auxiliary model used for approximate inference. Recent results indicate that better generative models can be obtained with better approximate inference procedures. Instead of improving the inference procedure, we propose here a new model, the bidirectional Helmholtz machine, which guarantees that the top-down and bottom-up distributions can efficiently invert each other. We achieve this by interpreting both the top-down and the bottom-up directed models as approximate inference distributions and by defining the model distribution to be the geometric mean of the two. We present a lower bound on the likelihood of this model and show that optimizing this bound regularizes the model so that the Bhattacharyya distance between the bottom-up and top-down approximate distributions is minimized. This approach yields state-of-the-art generative models that prefer significantly deeper architectures, while allowing for orders-of-magnitude more efficient likelihood estimation.
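The geometric-mean construction and the likelihood bound mentioned in the abstract can be made concrete. The following is a sketch reconstructed from the abstract alone, not quoted from the paper: p denotes the top-down generative joint, q the bottom-up joint, and Z the normalizer, all assumed notation.

% Model distribution: the normalized geometric mean of the
% top-down joint p(x,h) and the bottom-up joint q(x,h).
p^*(x,h) = \frac{1}{Z}\,\sqrt{p(x,h)\,q(x,h)},
\qquad
Z = \sum_{x,h} \sqrt{p(x,h)\,q(x,h)}

% By the Cauchy-Schwarz inequality, Z <= 1 for any two normalized
% joints p and q, so dropping the -log Z term yields a lower bound
% on the marginal likelihood:
\log p^*(x) = -\log Z + \log \sum_{h} \sqrt{p(x,h)\,q(x,h)}
\;\ge\; \log \sum_{h} \sqrt{p(x,h)\,q(x,h)}

% The slack of this bound is exactly the Bhattacharyya distance
% between the two joints, so maximizing the bound also pushes the
% top-down and bottom-up models toward each other:
D_B(p,q) = -\log \sum_{x,h} \sqrt{p(x,h)\,q(x,h)} = -\log Z

Under these assumptions, the inner sum over h can be estimated by importance sampling with proposals drawn from the bottom-up model, which is one plausible reading of the abstract's claim about more efficient likelihood estimation.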
Total citations
2016: 6, 2017: 3, 2018: 3, 2019: 6, 2020: 4, 2021: 2, 2022: 2, 2023: 2
Scholar articles
Bidirectional Helmholtz Machines
J Bornschein, S Shabanian, A Fischer, Y Bengio - International Conference on Machine Learning, 2016