TY - CONF

T1 - Dimension reduction in singularly perturbed continuous-time Bayesian networks

AU - Friedman, Nir

AU - Kupferman, Raz

PY - 2006

Y1 - 2006

N2 - Continuous-time Bayesian networks (CTBNs) are graphical representations of multi-component continuous-time Markov processes as directed graphs. The edges in the network represent direct influences among components. The joint rate matrix of the multi-component process is specified by means of conditional rate matrices for each component separately. This paper addresses the situation where some of the components evolve on a time scale much shorter than that of the other components. We prove that in the limit of infinite separation of scales, the Markov process converges (in distribution, or weakly) to a reduced, or effective, Markov process that involves only the slow components. We also demonstrate that for a reasonable separation of scales (an order of magnitude) the reduced process is a good approximation of the marginal process over the slow components. We provide a simple procedure for building a reduced CTBN for this effective process, with conditional rate matrices that can be calculated directly from the original CTBN, and discuss the implications for approximate reasoning in large systems.

UR - http://www.scopus.com/inward/record.url?scp=80053191304&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:80053191304

SN - 0974903922

SN - 9780974903927

T3 - Proceedings of the 22nd Conference on Uncertainty in Artificial Intelligence, UAI 2006

SP - 182

EP - 191

BT - Proceedings of the 22nd Conference on Uncertainty in Artificial Intelligence, UAI 2006

T2 - 22nd Conference on Uncertainty in Artificial Intelligence, UAI 2006

Y2 - 13 July 2006 through 16 July 2006

ER -