
Monday, May 2, 2016

Graph Clustering Bandits for Recommendation. (arXiv:1605.00596v1 [stat.ML])

We investigate an efficient context-dependent clustering technique for recommender systems based on exploration-exploitation strategies through multi-armed bandits over multiple users. Our algorithm dynamically groups users based on their observed behavioral similarity during a sequence of logged activities. In doing so, the algorithm reacts to the currently served user by shaping clusters around him/her but, at the same time, it explores the generation of clusters over users who are not currently engaged. We motivate the effectiveness of this clustering policy and provide an extensive empirical analysis on real-world datasets, showing scalability and improved prediction performance over state-of-the-art methods for sequential clustering of users in multi-armed bandit scenarios.



from cs.AI updates on arXiv.org http://ift.tt/24kwd8Y
via IFTTT
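
The abstract describes the approach only at a high level. The following is a minimal, hypothetical Python sketch of the general idea, not the authors' actual algorithm from the paper: each user keeps a LinUCB-style linear payoff estimator, a context-dependent neighborhood is formed from users whose estimates agree on the candidate item, and an aggregated upper-confidence score is used to pick the item. All names (ToyClusteringBandit, alpha, gamma) and the agreement threshold are illustrative assumptions.

    import numpy as np

    class ToyClusteringBandit:
        """Toy sketch of a context-dependent clustering bandit (illustrative only).

        - one linear payoff estimate per user (LinUCB-style statistics),
        - for the served user, a context-dependent neighborhood of users whose
          estimates on the candidate item are close (threshold gamma, assumed),
        - neighborhood estimates are aggregated to score and choose the item."""

        def __init__(self, n_users, dim, alpha=1.0, gamma=0.2):
            self.alpha = alpha  # exploration width
            self.gamma = gamma  # neighborhood agreement threshold (assumed)
            # per-user Gram matrices and reward vectors
            self.A = np.stack([np.eye(dim) for _ in range(n_users)])
            self.b = np.zeros((n_users, dim))

        def _ucb(self, u, x):
            """Return (estimated payoff, confidence width) of item x for user u."""
            theta = np.linalg.solve(self.A[u], self.b[u])
            width = self.alpha * np.sqrt(x @ np.linalg.solve(self.A[u], x))
            return theta @ x, width

        def recommend(self, user, items):
            """items: array of shape (n_items, dim); returns index of chosen item."""
            n_users = self.A.shape[0]
            scores = []
            for x in items:
                est_u, _ = self._ucb(user, x)
                # context-dependent neighborhood: users whose estimate on x is close
                hood = [v for v in range(n_users)
                        if abs(self._ucb(v, x)[0] - est_u) <= self.gamma]
                # aggregate optimistic (estimate + width) scores over the neighborhood
                agg = np.mean([sum(self._ucb(v, x)) for v in hood])
                scores.append(agg)
            return int(np.argmax(scores))

        def update(self, user, x, reward):
            """Update only the served user's statistics with the observed reward."""
            self.A[user] += np.outer(x, x)
            self.b[user] += reward * x

    # Hypothetical usage with random item contexts
    rng = np.random.default_rng(0)
    bandit = ToyClusteringBandit(n_users=20, dim=5)
    items = rng.normal(size=(10, 5))
    choice = bandit.recommend(user=3, items=items)
    bandit.update(user=3, x=items[choice], reward=1.0)

The sketch serves the current user while still maintaining separate statistics for every other user, so clusters can form around users who are not currently engaged as their own estimates evolve.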
