Authors
Kristian Kersting, Youssef El Massaoudi, Fabian Hadiji, Babak Ahmadi
Publication date
2010/7/4
Journal
Proceedings of the AAAI Conference on Artificial Intelligence
Volume
24
Issue
1
Pages
1181-1186
Description
Lifted inference, handling whole sets of indistinguishable objects together, is critical to applying probabilistic relational models effectively to realistic real-world tasks. Recently, lifted belief propagation (LBP) has been proposed as an efficient approximate solution to this inference problem. It runs a modified BP on a lifted network in which nodes have been grouped together if they have, roughly speaking, identical computation trees: the tree-structured "unrolling" of the underlying graph rooted at each node. In many situations, this purely syntactic criterion is too pessimistic, because message errors decay along paths. Intuitively, in a long chain graph with weak edge potentials, distant nodes will send and receive identical messages even though their computation trees are quite different. To overcome this, we propose iLBP, a novel, easy-to-implement, informed LBP approach that interleaves lifting with modified BP iterations. In turn, we can efficiently monitor the true BP messages sent and received in each iteration and group nodes accordingly. As our experiments show, iLBP can yield significantly more lifted networks, and hence faster inference, while not degrading performance. Above all, we show that iLBP is faster than BP when solving the problem of distributing data to a large network, an important real-world application where BP is faster than uninformed LBP.
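The message-based grouping idea behind iLBP can be sketched on the abstract's own intuition pump: a long binary chain MRF with weak pairwise potentials. The sketch below is illustrative only and not taken from the paper; the potentials, the single hypothetical evidence node, and the rounding-based message signature are all assumptions. Nodes are grouped whenever their (rounded) incoming BP messages coincide, so distant chain nodes collapse into one supernode even though their computation trees differ.

```python
import numpy as np

def chain_bp_groups(n=20, eps=0.1, iters=50, digits=4):
    """Loopy BP on a binary chain MRF, then group nodes whose incoming
    messages agree up to rounding (a message-based, 'informed' lifting)."""
    # Weak attractive pairwise potential; hypothetical evidence on node 0 only.
    psi = np.array([[np.exp(eps), np.exp(-eps)],
                    [np.exp(-eps), np.exp(eps)]])
    phi0 = np.array([0.8, 0.2])           # assumed unary evidence at node 0
    right = np.full((n, 2), 0.5)          # right[i]: message i -> i+1
    left = np.full((n, 2), 0.5)           # left[i]:  message i -> i-1
    for _ in range(iters):
        new_right, new_left = right.copy(), left.copy()
        for i in range(n - 1):            # messages flowing rightward
            inc = right[i - 1] if i > 0 else phi0
            m = inc @ psi
            new_right[i] = m / m.sum()
        for i in range(1, n):             # messages flowing leftward
            inc = left[i + 1] if i < n - 1 else np.ones(2)
            m = inc @ psi
            new_left[i] = m / m.sum()
        right, left = new_right, new_left
    # Signature of node i: its rounded incoming messages (None at the border).
    groups = {}
    for i in range(n):
        sig = (tuple(np.round(right[i - 1], digits)) if i > 0 else None,
               tuple(np.round(left[i + 1], digits)) if i < n - 1 else None)
        groups.setdefault(sig, []).append(i)
    return groups
```

On a 20-node chain the evidence influence decays within a few hops (each edge shrinks the message deviation by roughly tanh(eps)), so all but a handful of boundary nodes fall into one group, whereas a purely syntactic computation-tree criterion would distinguish nodes by their distance to the chain ends.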
Total citations
[per-year citation histogram, 2010–2021; counts not recoverable]