Thursday, August 11, 2016

Online Context-Dependent Clustering in Recommendations based on Exploration-Exploitation Algorithms. (arXiv:1608.03544v1 [cs.LG])

We investigate two context-dependent clustering techniques for content recommendation based on exploration-exploitation strategies in contextual multi-armed bandit settings. Our algorithms dynamically group users based on the items under consideration and, possibly, group items based on the similarity of the clusterings induced over the users. The resulting algorithm thus takes advantage of preference patterns in the data in a way akin to collaborative filtering methods. We provide an empirical analysis on extensive real-world datasets, showing scalability and increased prediction performance over state-of-the-art methods for clustering bandits. For one of the two algorithms we also give a regret analysis within a standard linear stochastic noise setting.
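The paper itself does not include code, but the core idea of a clustering bandit can be sketched: a LinUCB-style contextual bandit that shares ridge-regression statistics across a group of users, so that feedback from one user in a cluster informs recommendations for the others. The sketch below is a simplified illustration under that assumption (the class name, fixed clustering, and parameters are hypothetical; the paper's algorithms additionally update the clustering dynamically):

```python
import numpy as np

class ClusteredLinUCB:
    """Minimal sketch of a clustering bandit: one LinUCB learner per
    cluster of users, with statistics shared within each cluster.
    The published algorithms re-estimate the clustering online; here
    the clustering is fixed for clarity."""

    def __init__(self, n_clusters, dim, alpha=1.0):
        self.alpha = alpha  # exploration strength
        # One ridge-regression state (A, b) per cluster of users.
        self.A = [np.eye(dim) for _ in range(n_clusters)]
        self.b = [np.zeros(dim) for _ in range(n_clusters)]

    def select(self, cluster, item_features):
        """Pick the item with the highest upper confidence bound for
        the given user's cluster. item_features: (n_items, dim)."""
        A_inv = np.linalg.inv(self.A[cluster])
        theta = A_inv @ self.b[cluster]  # cluster-level preference estimate
        # UCB = estimated reward + exploration bonus per item.
        widths = np.sqrt(
            np.einsum("ij,jk,ik->i", item_features, A_inv, item_features))
        return int(np.argmax(item_features @ theta + self.alpha * widths))

    def update(self, cluster, x, reward):
        """Fold the observed (item, reward) pair into the cluster's
        shared statistics."""
        self.A[cluster] += np.outer(x, x)
        self.b[cluster] += reward * x
```

Because all users in a cluster write into the same `(A, b)` pair, preference patterns are pooled in a way loosely analogous to collaborative filtering, which is the effect the abstract highlights.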



from cs.AI updates on arXiv.org http://ift.tt/2bmj86F
via IFTTT