Recursive conditioning

Authors: Adnan Darwiche

Abstract

We introduce an any-space algorithm for exact inference in Bayesian networks, called recursive conditioning. On one extreme, recursive conditioning takes O(n) space and O(n exp(w log n)) time, where n is the size of a Bayesian network and w is the width of a given elimination order, therefore establishing a new complexity result for linear-space inference in Bayesian networks. On the other extreme, recursive conditioning takes O(n exp(w)) space and O(n exp(w)) time, therefore matching the complexity of state-of-the-art algorithms based on clustering and elimination. In between linear and exponential space, recursive conditioning can utilize memory at increments of X bytes, where X is the number of bytes needed to store a floating point number in a cache. Moreover, the algorithm is equipped with a formula for computing its average running time under any amount of space, hence providing a valuable tool for time–space tradeoffs in demanding applications. Recursive conditioning is therefore the first algorithm for exact inference in Bayesian networks to offer a smooth tradeoff between time and space, and to explicate a smooth, quantitative relationship between these two important resources.
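The abstract only summarizes the algorithm's behaviour, so the following is a minimal illustrative sketch of the recursion it refers to: conditioning on cutsets over a decomposition tree (dtree), with an optional cache at each internal node. The data layout and the names `Leaf`, `Node`, and `rc` are our own assumptions (binary variables, a hand-built dtree), not the paper's exact formulation; keeping the cache corresponds to the exponential-space extreme, and dropping it corresponds to the linear-space extreme.

```python
from itertools import product

class Leaf:
    """Dtree leaf holding one CPT, P(child | parents); the table is keyed by
    a tuple of values: (child value, then parent values in the given order)."""
    def __init__(self, child, parents, table):
        self.child, self.parents, self.table = child, list(parents), table
        self.vars = frozenset([child, *parents])

class Node:
    """Internal dtree node; its cache is keyed by the instantiation of the
    node's context (the node's variables that are already fixed above it)."""
    def __init__(self, left, right):
        self.left, self.right = left, right
        self.vars = left.vars | right.vars
        self.cache = {}

def rc(t, assignment, fixed=frozenset()):
    """Probability mass of the sub-network under dtree node t, given the
    partial assignment to the variables in `fixed` (evidence plus cutsets)."""
    if isinstance(t, Leaf):
        free = [v for v in [t.child, *t.parents] if v not in fixed]
        total = 0.0
        for vals in product([0, 1], repeat=len(free)):   # sum out free vars
            full = {**assignment, **dict(zip(free, vals))}
            total += t.table[(full[t.child], *(full[p] for p in t.parents))]
        return total
    # The instantiation of t's context identifies this computation; reuse it.
    context = tuple(sorted((v, assignment[v]) for v in t.vars & fixed))
    if context in t.cache:           # full caching -> O(n exp(w)) time and space
        return t.cache[context]
    # Cutset: uninstantiated variables shared by the two subtrees; conditioning
    # on them decomposes the problem into two independent subproblems.
    cutset = sorted((t.left.vars & t.right.vars) - fixed)
    total = 0.0
    for vals in product([0, 1], repeat=len(cutset)):
        ext = {**assignment, **dict(zip(cutset, vals))}
        sub_fixed = fixed | set(cutset)
        total += rc(t.left, ext, sub_fixed) * rc(t.right, ext, sub_fixed)
    t.cache[context] = total         # drop this line to run in O(n) space
    return total

# Tiny illustration: network A -> B with binary variables, dtree with two leaves.
cpt_a = Leaf("A", [], {(1,): 0.3, (0,): 0.7})
cpt_b = Leaf("B", ["A"], {(1, 1): 0.9, (0, 1): 0.1, (1, 0): 0.2, (0, 0): 0.8})
dtree = Node(cpt_a, cpt_b)
print(rc(dtree, {"B": 1}, frozenset({"B"})))   # P(B=1) = 0.3*0.9 + 0.7*0.2 = 0.41
```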

Keywords: Bayesian networks, Probabilistic inference, Time–space tradeoff, Conditioning methods, Decompositional reasoning

Review process: Available online 16 March 2001.

Paper link: https://doi.org/10.1016/S0004-3702(00)00069-2