Learning Bayesian Networks with Local Structure

Nir Friedman, Moises Goldszmidt

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

We examine a novel addition to the known methods for learning Bayesian networks from data that improves the quality of the learned networks. Our approach explicitly represents and learns the local structure in the conditional probability distributions (CPDs) that quantify these networks. This increases the space of possible models, enabling the representation of CPDs with a variable number of parameters. The resulting learning procedure induces models that better emulate the interactions present in the data. We describe the theoretical foundations and practical aspects of learning local structures and provide an empirical evaluation of the proposed learning procedure. This evaluation indicates that learning curves characterizing this procedure converge faster, in the number of training instances, than those of the standard procedure, which ignores the local structure of the CPDs. Our results also show that networks learned with local structures tend to be more complex (in terms of arcs), yet require fewer parameters.
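The abstract's central claim, that representing local structure in a CPD can require fewer parameters than a full conditional probability table, can be sketched as follows. This is a hypothetical illustration of a tree-structured CPD with context-specific independence, not the authors' code; all names are made up for the example.

```python
from itertools import product

def full_table_params(n_parents):
    # A full CPT over binary parents needs one parameter per parent configuration.
    return 2 ** n_parents

# A tree-structured CPD: each internal node tests one parent; leaves hold P(X=1 | path).
# Here X has parents A, B, C, but once A=1 the values of B and C are irrelevant
# (context-specific independence), so the tree needs only 3 leaves, not 2**3 = 8 rows.
tree_cpd = ("A", 0.9,            # A == 1  -> P(X=1) = 0.9
            ("B", 0.7, 0.1))     # A == 0  -> test B: B==1 -> 0.7, B==0 -> 0.1

def tree_lookup(tree, assignment):
    # Walk the tree according to the parent assignment; a float is a leaf probability.
    var, if_true, if_false = tree
    branch = if_true if assignment[var] == 1 else if_false
    return branch if isinstance(branch, float) else tree_lookup(branch, assignment)

def tree_num_params(tree):
    # Count leaves: one free parameter per leaf.
    _, t, f = tree
    return sum(1 if isinstance(b, float) else tree_num_params(b) for b in (t, f))

# Every one of the 8 parent configurations still maps to a probability ...
for a, b, c in product([0, 1], repeat=3):
    p = tree_lookup(tree_cpd, {"A": a, "B": b, "C": c})
    assert 0.0 <= p <= 1.0

# ... but with 3 parameters instead of 8.
print(tree_num_params(tree_cpd), "vs", full_table_params(3))  # → 3 vs 8
```

This is the sense in which networks with local structure can afford more arcs while using fewer parameters: the per-node cost of adding a parent grows with the number of tree leaves rather than doubling the table size.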
Original language: English
Title of host publication: Learning in Graphical Models
Editors: Michael I. Jordan
Place of publication: Dordrecht
Publisher: Springer Netherlands
Pages: 421-459
Number of pages: 39
ISBN (Electronic): 978-94-011-5014-9
ISBN (Print): 978-94-010-6104-9
DOIs
State: Published - 1998

Publication series

Name: NATO ASI series. Series D, Behavioural and social sciences
Publisher: Springer
Volume: 89
ISSN (Print): 0258-123X

Keywords

  • Bayesian Network
