Monday, June 13, 2016

Reweighted Data for Robust Probabilistic Models. (arXiv:1606.03860v1 [stat.ML])

Probabilistic models analyze data by relying on a set of assumptions. When a model performs poorly, we challenge its assumptions. This approach has led to myriad hand-crafted robust models; they offer protection against small deviations from their assumptions. We propose a simple way to systematically mitigate model mismatch for a large class of probabilistic models. The idea is to raise the likelihood of each observation to a weight. Inferring these weights allows a model to identify observations that match its assumptions; down-weighting the others enables robust inference and improved predictive accuracy. We study four different forms of model mismatch, ranging from missing latent groups to structure misspecification. A Poisson factorization analysis of the MovieLens dataset shows the benefits of reweighting in a real data scenario.
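The core mechanism, raising each observation's likelihood to an inferred weight, can be illustrated with a small sketch. The code below is not the paper's algorithm (which applies to a broad class of probabilistic models and inference schemes); it is a minimal coordinate-ascent illustration on Gaussian mean estimation with outliers, where the weight prior, the prior strength lam, and the data are all assumptions made for this example.

# Toy sketch of the reweighted-likelihood idea: each observation's
# log-likelihood is multiplied by a weight in [0, 1], and the weights are
# inferred alongside the model parameters so that observations violating
# the model's assumptions get down-weighted.  NOT the authors' method;
# a minimal MAP / coordinate-ascent illustration under assumed choices.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95),      # inliers
                    rng.normal(10.0, 1.0, 5)])     # gross outliers

lam = 5.0                  # assumed prior strength pulling weights toward 1
mu, sigma = x.mean(), 1.0  # start from the non-robust estimate
w = np.ones_like(x)

for _ in range(20):
    # Weight step: with the linear prior log p(w_i) = lam * w_i, the
    # objective is linear in each w_i, so the maximizer sits at a boundary
    # of [0, 1]: keep a point iff its log-likelihood plus lam is positive.
    loglik = norm.logpdf(x, loc=mu, scale=sigma)
    w = (loglik + lam > 0).astype(float)
    # Parameter step: weighted maximum likelihood for the Gaussian mean.
    mu = np.sum(w * x) / np.sum(w)

print(f"non-robust mean: {x.mean():.2f}, reweighted mean: {mu:.2f}")
print(f"observations down-weighted: {int((w == 0).sum())}")

Run as-is, the gross outliers receive zero weight after the first iteration and the reweighted mean stays near zero, whereas the plain sample mean is pulled toward the outliers; the paper infers continuous weights within the model rather than hard 0/1 weights as in this toy example.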



from cs.AI updates on arXiv.org http://ift.tt/1YorIoY
via IFTTT
