Linear programming relaxations and belief propagation - An empirical study

Chen Yanover*, Talya Meltzer, Yair Weiss

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

130 Scopus citations

Abstract

The problem of finding the most probable (MAP) configuration in graphical models comes up in a wide range of applications. In a general graphical model this problem is NP-hard, but various approximate algorithms have been developed. Linear programming (LP) relaxations are a standard method in computer science for approximating combinatorial problems and have been used for finding the most probable assignment in small graphical models. However, applying this powerful method to real-world problems is extremely challenging due to the large numbers of variables and constraints in the linear program. Tree-reweighted belief propagation (TRBP) is a promising recent algorithm for solving LP relaxations, but little is known about its running time on large problems. In this paper we compare TRBP and powerful general-purpose LP solvers (CPLEX) on relaxations of real-world graphical models from the fields of computer vision and computational biology. We find that TRBP almost always finds the solution significantly faster than all the solvers in CPLEX and, more importantly, that TRBP can be applied to large-scale problems to which the solvers in CPLEX cannot be applied. Using TRBP we can find the MAP configurations in a matter of minutes for a large range of real-world problems.
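For context, the LP relaxation referred to in the abstract is, in its standard pairwise form, a relaxation of the MAP integer program over locally consistent pseudomarginals. The sketch below uses our own notation for illustration and is not taken verbatim from the paper; see the article for the exact formulation used in the experiments.

```latex
% Sketch of the pairwise MAP problem and its standard LP relaxation
% (notation is ours; theta_i, theta_ij are node and edge potentials in log space).
\begin{align*}
\text{MAP:}\quad
  & \max_{x} \; \sum_{i} \theta_i(x_i) \;+\; \sum_{(i,j)\in E} \theta_{ij}(x_i, x_j) \\[4pt]
\text{LP relaxation:}\quad
  & \max_{\mu \ge 0} \; \sum_{i}\sum_{x_i} \theta_i(x_i)\,\mu_i(x_i)
    \;+\; \sum_{(i,j)\in E}\sum_{x_i, x_j} \theta_{ij}(x_i, x_j)\,\mu_{ij}(x_i, x_j) \\
\text{s.t.}\quad
  & \sum_{x_i} \mu_i(x_i) = 1 \qquad \forall i, \\
  & \sum_{x_j} \mu_{ij}(x_i, x_j) = \mu_i(x_i) \qquad \forall (i,j)\in E,\ \forall x_i .
\end{align*}
% If the optimum of the relaxation is integral, it recovers the exact MAP assignment.
% TRBP and generic LP solvers such as CPLEX are two different ways of solving this program,
% which is the comparison carried out empirically in the paper.
```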

Original language: English
Pages (from-to): 1887-1907
Number of pages: 21
Journal: Journal of Machine Learning Research
Volume: 7
State: Published - Sep 2006
