
Tuesday, November 24, 2015

Generalized Product of Experts for Automatic and Principled Fusion of Gaussian Process Predictions. (arXiv:1410.7827v2 [cs.LG] UPDATED)

In this work, we propose a generalized product of experts (gPoE) framework for combining the predictions of multiple probabilistic models. We identify four desirable properties that are important for scalability, expressiveness and robustness when learning and inferring with a combination of multiple models. Through analysis and experiments, we show that the gPoE of Gaussian processes (GPs) has these qualities, while no other existing combination scheme satisfies all of them at the same time. The resulting GP-gPoE is highly scalable, as individual GP experts can be learned independently in parallel; very expressive, as the way experts are combined depends on the input rather than being fixed; the combined prediction is still a valid probabilistic model with a natural interpretation; and finally, it is robust to unreliable predictions from individual experts.
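The abstract describes fusing the Gaussian predictions of independent experts with input-dependent weights. Below is a minimal sketch of what such a precision-weighted combination could look like; the function names, the uniform-weight default, and the `entropy_change_weights` rule (half the entropy reduction from prior to posterior) are illustrative assumptions for this post, not the paper's exact implementation.

```python
import numpy as np

def gpoe_combine(means, variances, weights=None):
    """
    Combine per-expert Gaussian predictions with a generalized
    product-of-experts style rule.

    means, variances: arrays of shape (n_experts, n_points)
    weights: optional array of shape (n_experts, n_points); if None,
             uniform weights 1/n_experts are used.
    Returns the combined (mean, variance), each of shape (n_points,).
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    n_experts = means.shape[0]
    if weights is None:
        weights = np.full_like(means, 1.0 / n_experts)

    # Each expert contributes a weighted precision; experts with
    # weight ~0 are effectively ignored at that input.
    precisions = weights / variances
    combined_precision = precisions.sum(axis=0)
    combined_variance = 1.0 / combined_precision
    combined_mean = combined_variance * (precisions * means).sum(axis=0)
    return combined_mean, combined_variance


def entropy_change_weights(prior_variance, posterior_variances):
    """
    Hypothetical input-dependent weights: half the reduction in
    Gaussian entropy from prior to posterior, so an expert that is
    no more informative than the prior gets weight ~0.
    """
    post = np.asarray(posterior_variances, dtype=float)
    return 0.5 * (np.log(prior_variance) - np.log(post))


if __name__ == "__main__":
    # Two toy experts predicting at three test points.
    mu = np.array([[0.9, 2.1, 0.0],
                   [1.1, 1.9, 5.0]])
    var = np.array([[0.10, 0.20, 4.00],   # expert 1: confident on points 1-2
                    [0.15, 0.25, 0.05]])  # expert 2: confident on point 3
    w = entropy_change_weights(prior_variance=4.0, posterior_variances=var)
    mean, variance = gpoe_combine(mu, var, weights=w)
    print("combined mean:    ", mean)
    print("combined variance:", variance)
```

At the third test point, expert 1 is no better than the prior, so its weight is zero and the fused prediction follows expert 2; this is the kind of robustness to unreliable experts the abstract refers to.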



from cs.AI updates on arXiv.org http://ift.tt/1tiGezu
via IFTTT
