Recent work on weighted model counting has been applied with great success to probabilistic inference in Bayesian networks. The probability distribution is encoded into a Boolean normal form and compiled to a target language in order to represent local structure among the conditional probabilities more efficiently. We show that further improvements are possible by exploiting knowledge that is otherwise lost during the encoding phase, incorporating it into a compiler inspired by Satisfiability Modulo Theories (SMT). Constraints among variables are used as a background theory, which allows us to optimize the Shannon decomposition. We propose a new language, called Weighted Positive Binary Decision Diagrams (WPBDDs), that reduces the cost of probabilistic inference by using this decomposition variant to induce an arithmetic circuit of reduced size.
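To make concrete what weighted model counting and Shannon decomposition refer to here, below is a minimal Python sketch of the textbook recursion over a CNF encoding. This is not the paper's WPBDD compiler; the clause representation, weights, and function names are illustrative assumptions.

```python
def condition(cnf, lit):
    """Condition a CNF (frozenset of frozensets of int literals) on `lit`."""
    out = set()
    for clause in cnf:
        if lit in clause:
            continue               # clause satisfied, drop it
        reduced = clause - {-lit}  # remove the now-false literal
        if not reduced:
            return None            # empty clause: branch is inconsistent
        out.add(reduced)
    return frozenset(out)

def wmc(cnf, variables, weights):
    """Shannon decomposition: WMC(f) = w(x)*WMC(f|x) + w(-x)*WMC(f|-x)."""
    if cnf is None:
        return 0.0                 # inconsistent branch contributes nothing
    if not cnf:
        # All clauses satisfied: remaining variables are free, so their
        # weights sum out as a constant factor.
        total = 1.0
        for v in variables:
            total *= weights[v] + weights[-v]
        return total
    x, rest = variables[0], variables[1:]
    return (weights[x] * wmc(condition(cnf, x), rest, weights) +
            weights[-x] * wmc(condition(cnf, -x), rest, weights))

# Toy example: an exactly-one constraint over two indicator variables, in
# the style of standard Bayesian-network encodings (values illustrative).
cnf = frozenset({frozenset({1, 2}), frozenset({-1, -2})})
weights = {1: 0.3, -1: 1.0, 2: 0.7, -2: 1.0}
print(wmc(cnf, [1, 2], weights))   # 0.3*1.0 + 1.0*0.7 = 1.0
```

In this naive recursion, inconsistent branches are only discovered when conditioning produces an empty clause; as the abstract describes, treating the constraints among variables as a background theory lets the compiler prune such branches structurally, yielding a smaller induced arithmetic circuit.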
from cs.AI updates on arXiv.org http://ift.tt/2dZMREG
via IFTTT