We present a novel framework for performing statistical sampling, expectation estimation, and partition function approximation using \emph{arbitrary} heuristic stochastic processes defined over discrete state spaces. Using a highly parallel construction we call the \emph{sequential constraining process}, we simultaneously generate states with the heuristic process and accurately estimate their probabilities, even when those probabilities are far too small to be realistically inferred by direct counting. After showing that both theoretically correct importance sampling and Markov chain Monte Carlo are possible with the sequential constraining process, we integrate it into a methodology called \emph{state space sampling}, which extends the ideas of state space search from computer science to the sampling context. The methodology comprises a dynamic data structure that constructs a robust Bayesian model of the statistics generated by the heuristic process, subject to an accuracy constraint expressed as the posterior Kullback-Leibler divergence. Sampling from the dynamic structure generally yields partial states, which are completed by recursively calling the heuristic to refine the structure and then resuming the sampling. Our experiments on various Ising models suggest that state space sampling enables heuristic state generation with accurate probability estimates, as demonstrated by the convergence of a simulated annealing process to the Boltzmann distribution with increasing run length. This permits, for the first time, direct importance sampling using the \emph{final} (marginal) distribution of a generic stochastic process, potentially broadening the range of algorithms at the Monte Carlo practitioner's disposal.
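To make the role of the estimated state probabilities concrete, here is a minimal, self-contained sketch of the self-normalized importance-sampling and partition-function step that such estimates would feed into. It is not the paper's algorithm: instead of an arbitrary heuristic whose probabilities are estimated by the sequential constraining process, it uses a toy 1-D Ising chain and an independent-spin proposal whose probability is available in closed form, and all names and parameter values (N, J, beta, p) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: a small 1-D Ising chain with coupling J at inverse temperature beta.
N, J, beta = 16, 1.0, 0.7

def energy(s):
    """Nearest-neighbour Ising energy with periodic boundary: E = -J * sum_i s_i s_{i+1}."""
    return -J * np.sum(s * np.roll(s, 1))

# Stand-in "heuristic" proposal: independent spins with P(s_i = +1) = p.
# In the paper the proposal is an arbitrary stochastic process and its state
# probabilities are *estimated*; here the log-probability log_q is exact so the
# sketch runs end to end.
p = 0.6
def propose(n_samples):
    s = np.where(rng.random((n_samples, N)) < p, 1, -1)
    log_q = np.sum(np.where(s == 1, np.log(p), np.log(1.0 - p)), axis=1)
    return s, log_q

# Self-normalized importance sampling of <E> under the Boltzmann distribution
# pi(s) proportional to exp(-beta * E(s)); only unnormalized log-weights are needed.
samples, log_q = propose(20000)
E = np.array([energy(s) for s in samples])
log_w = -beta * E - log_q
w = np.exp(log_w - log_w.max())          # stabilize before normalizing
w /= w.sum()
print("estimated <E>        :", float(np.dot(w, E)))
print("effective sample size:", float(1.0 / np.sum(w**2)))

# The same weights also give a partition-function estimate,
# Z ~ mean over samples of exp(-beta*E(s)) / q(s), computed here in log space.
log_Z_hat = np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
print("estimated log Z      :", float(log_Z_hat))
```

Swapping the closed-form log_q for probability estimates produced alongside heuristically generated states is exactly the substitution the abstract describes; the weighting and normalization steps themselves are unchanged.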
from cs.AI updates on arXiv.org http://ift.tt/1PC7dSV