In Bayesian statistics, probability distributions express beliefs. However, for many problems these beliefs cannot be computed analytically, so approximations are needed. We seek a ranking function that quantifies how "embarrassing" it is to communicate a given approximation. We show that there is only one such ranking under two requirements: (1) the best-ranked approximation is the non-approximated belief, and (2) the ranking judges approximations only by their predictions for actual outcomes. We find that this ranking is equivalent to the Kullback-Leibler divergence that is frequently used in the literature. However, there seems to be confusion about the correct order in which its functional arguments, the approximated and non-approximated beliefs, should be used. We hope that our elementary derivation settles this apparent confusion. We show, for example, that when approximating beliefs with Gaussian distributions, the optimal approximation is given by moment matching. This is in contrast to many suggested computational schemes.
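As a minimal numerical sketch of the abstract's claim (not from the paper; the mixture target, grid, and perturbations below are arbitrary illustrative choices): writing KL(p || q) = \int p(x) \log(p(x)/q(x)) dx with the non-approximated belief p as the first argument, minimizing over Gaussian approximations q recovers moment matching. The script checks that perturbing the matched mean or standard deviation increases KL(p || q).

    import numpy as np
    from scipy.stats import norm

    # Non-approximated belief p: a two-component Gaussian mixture
    # (arbitrary illustrative parameters, not from the paper).
    w, mu1, mu2, s1, s2 = 0.3, -2.0, 1.5, 0.8, 1.2
    x = np.linspace(-10.0, 10.0, 20001)
    dx = x[1] - x[0]
    p = w * norm.pdf(x, mu1, s1) + (1 - w) * norm.pdf(x, mu2, s2)

    def kl_p_q(mu, sigma):
        # KL(p || q) approximated by a Riemann sum on the grid,
        # with the non-approximated belief p as first argument.
        q = norm.pdf(x, mu, sigma)
        return np.sum(p * np.log(p / q)) * dx

    # Moment matching: q shares the mean and variance of p.
    mean_p = np.sum(x * p) * dx
    var_p = np.sum((x - mean_p) ** 2 * p) * dx

    print("moment-matched KL:", kl_p_q(mean_p, np.sqrt(var_p)))
    # Any deviation from the matched moments increases KL(p || q).
    print("shifted mean KL  :", kl_p_q(mean_p + 0.5, np.sqrt(var_p)))
    print("inflated std KL  :", kl_p_q(mean_p, 1.5 * np.sqrt(var_p)))

Reversing the argument order, i.e. minimizing KL(q || p), would instead favor mode-seeking fits, which is the direction many computational schemes use; this is the distinction the abstract is pointing at.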
from cs.AI updates on arXiv.org http://ift.tt/2fuEmUW
via IFTTT