Authors
Sinead Williamson, Avinava Dubey, Eric Xing
Publication date
2013
Conference
Proceedings of the 30th International Conference on Machine Learning
Pages
98-106
Description
Nonparametric mixture models based on the Dirichlet process are an elegant alternative to finite models when the number of underlying components is unknown, but inference in such models can be slow. Existing attempts to parallelize inference in such models have relied on introducing approximations, which can lead to inaccuracies in the posterior estimate. In this paper, we describe auxiliary variable representations for the Dirichlet process and the hierarchical Dirichlet process that allow us to perform MCMC using the correct equilibrium distribution, in a distributed manner. We show that our approach allows scalable inference without the deterioration in estimate quality that accompanies existing methods.
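The description does not spell out the construction, but the kind of auxiliary-variable representation it alludes to can be sketched roughly as follows (an assumption based on the abstract, not the authors' code): draw mixing weights pi from Dirichlet(alpha/P, ..., alpha/P) and let each of P processors hold an independent DP(alpha/P, H); their pi-weighted mixture is then again distributed as DP(alpha, H), so data can be partitioned across processors and sampled locally without approximating the posterior. The minimal Python sketch below checks this informally by comparing the clustering induced by the auxiliary construction against an ordinary Chinese restaurant process; the function names, the choice of alpha, n, P, and the Monte Carlo check are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def crp_cluster_sizes(concentration, n, rng):
    """Cluster sizes of n points under a Chinese restaurant process."""
    counts = []
    for _ in range(n):
        probs = np.array(counts + [concentration], dtype=float)
        k = rng.choice(len(probs), p=probs / probs.sum())
        if k == len(counts):
            counts.append(1)      # open a new cluster
        else:
            counts[k] += 1        # join an existing cluster
    return counts

def auxiliary_cluster_sizes(alpha, n, P, rng):
    """Cluster sizes of n points under the assumed auxiliary-variable
    construction: mixing weights pi ~ Dirichlet(alpha/P, ..., alpha/P);
    each point first picks a processor j with probability pi_j, then joins
    that processor's local CRP with concentration alpha/P.  Clusters on
    different processors are distinct (continuous base measure)."""
    pi = rng.dirichlet(np.full(P, alpha / P))
    assignments = rng.choice(P, size=n, p=pi)
    sizes = []
    for j in range(P):
        n_j = int(np.sum(assignments == j))
        sizes.extend(crp_cluster_sizes(alpha / P, n_j, rng))
    return sizes

# If the weighted mixture of local DPs is marginally DP(alpha, H), the induced
# partition (and hence the number of clusters) should match the plain CRP(alpha)
# up to Monte Carlo error.
alpha, n, P, reps = 2.0, 200, 4, 300
k_direct = np.mean([len(crp_cluster_sizes(alpha, n, rng)) for _ in range(reps)])
k_aux = np.mean([len(auxiliary_cluster_sizes(alpha, n, P, rng)) for _ in range(reps)])
print(f"mean clusters  direct CRP: {k_direct:.2f}   auxiliary: {k_aux:.2f}")
```

Under this (assumed) representation the per-processor samplers only need to communicate when cluster-to-processor assignments are resampled, which is what makes distributed MCMC with the correct equilibrium distribution possible.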
Total citations
[Citation counts per year, 2013–2024]
Scholar articles
Parallel Markov chain Monte Carlo for nonparametric mixture models — S Williamson, A Dubey, E Xing - International Conference on Machine Learning, 2013
Parallel Markov chain Monte Carlo for nonparametric mixture models — SA Williamson, A Dubey, EP Xing - arXiv preprint arXiv:1211.7120, 2012