Causal Inference Despite Limited Global Confounding via Mixture Models

Spencer Gordon, Bijan Mazaheri, Yuval Rabani, Leonard Schulman

Research output: Contribution to journal › Conference article › peer-review


Abstract

A Bayesian Network is a directed acyclic graph (DAG) on a set of n random variables (the vertices); a Bayesian Network Distribution (BND) is a probability distribution on the random variables that is Markovian on the graph. A finite k-mixture of such models is graphically represented by a larger graph which has an additional “hidden” (or “latent”) random variable U, taking values in {1, ..., k}, and a directed edge from U to every other vertex. Models of this type are fundamental to causal inference, where U models an unobserved confounding effect of multiple populations, obscuring the causal relationships in the observable DAG. Solving the mixture problem and recovering the joint probability distribution that includes U renders traditionally unidentifiable causal relationships identifiable. Using a reduction to the better-studied “product” case on empty graphs, we give the first algorithm to learn mixtures of non-empty DAGs.
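To make the model described in the abstract concrete, the following is a minimal sketch, not taken from the paper: a k = 2 mixture of Bayesian networks on a toy two-variable DAG, where the latent confounder U has an edge into every observed vertex. The DAG, parameter values, and variable names are illustrative assumptions. The sketch shows that the observed distribution marginalizes U out, which is precisely why the mixture-learning task is to recover the mixture weights and per-component conditional distributions from observations of the visible variables alone.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-mixture (k = 2) of Bayesian networks on the observable DAG X1 -> X2.
# The hidden variable U in {0, 1} has a directed edge to every observed vertex:
# it selects which component's conditional distributions generate each sample.
# Parameters are illustrative only, not taken from the paper.
p_u = np.array([0.6, 0.4])                 # P(U = u)
p_x1 = np.array([0.2, 0.8])                # P(X1 = 1 | U = u)
p_x2 = np.array([[0.1, 0.7],               # P(X2 = 1 | U = u, X1 = x1)
                 [0.5, 0.9]])

def sample(n):
    """Draw n observations of (X1, X2); U is latent and discarded."""
    u = rng.choice(2, size=n, p=p_u)
    x1 = (rng.random(n) < p_x1[u]).astype(int)
    x2 = (rng.random(n) < p_x2[u, x1]).astype(int)
    return x1, x2

# The observed joint is a mixture: P(x1, x2) = sum_u P(u) P(x1|u) P(x2|u, x1).
# Recovering P(u) and the per-component CPDs from (X1, X2) samples alone is
# the mixture problem the paper addresses.
obs = np.zeros((2, 2))
for u in range(2):
    for x1 in range(2):
        for x2 in range(2):
            px1 = p_x1[u] if x1 else 1 - p_x1[u]
            px2 = p_x2[u, x1] if x2 else 1 - p_x2[u, x1]
            obs[x1, x2] += p_u[u] * px1 * px2

x1_s, x2_s = sample(200_000)
emp = np.histogram2d(x1_s, x2_s, bins=2)[0] / len(x1_s)
print("analytic P(X1, X2):\n", obs)
print("empirical estimate:\n", emp)
```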

Original language: English
Pages (from-to): 574-601
Number of pages: 28
Journal: Proceedings of Machine Learning Research
Volume: 213
State: Published - 2023
Event: 2nd Conference on Causal Learning and Reasoning, CLeaR 2023 - Tübingen, Germany
Duration: 11 Apr 2023 - 14 Apr 2023

Bibliographical note

Publisher Copyright:
© 2023 S. Gordon, B. Mazaheri, Y. Rabani & L. Schulman.

Keywords

  • Bayesian networks
  • causal DAGs
  • causal identifiability
  • global confounding
  • hidden confounder
  • mixture models
  • population confounder
