Linking online planning for MDPs with their special case of stochastic multi-armed bandit problems, we analyze three state-of-the-art Monte-Carlo tree search algorithms: UCT, BRUE, and MaxUCT. Using the outcome, we (i) introduce two new MCTS algorithms, MaxBRUE, which combines uniform sampling with Bellman backups, and MpaUCT, which combines UCB1 with a novel backup procedure, (ii) analyze them formally and empirically, and (iii) show how MCTS algorithms can be further stratified by an exploration control mechanism that improves their empirical performance without harming the formal guarantees.
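To make the ingredients the abstract contrasts concrete, the Python sketch below illustrates UCB1 action selection, Monte-Carlo (average) backups as used in UCT, and Bellman (max) backups as used in MaxUCT-style algorithms. The `Node` class and method names are illustrative assumptions for exposition, not the algorithms defined in the paper.

```python
import math

class Node:
    """Decision-node statistics for a single state (illustrative sketch only)."""
    def __init__(self, actions):
        self.counts = {a: 0 for a in actions}      # visit count per action
        self.values = {a: 0.0 for a in actions}    # running value estimate per action

    def ucb1_select(self, c=math.sqrt(2)):
        """UCB1: pick the action maximizing value plus an exploration bonus."""
        total = sum(self.counts.values())
        def score(a):
            if self.counts[a] == 0:
                return float("inf")                # try every action at least once
            return self.values[a] + c * math.sqrt(math.log(total) / self.counts[a])
        return max(self.counts, key=score)

    def mc_backup(self, action, sample_return):
        """Monte-Carlo backup: incremental average of sampled returns (UCT-style)."""
        self.counts[action] += 1
        n = self.counts[action]
        self.values[action] += (sample_return - self.values[action]) / n

    def bellman_backup(self, action, reward, child_values, gamma=1.0):
        """Bellman (max) backup: reward plus discounted max over the child's action values."""
        self.counts[action] += 1
        best_child = max(child_values) if child_values else 0.0
        self.values[action] = reward + gamma * best_child
```

The sketch is only meant to show how the selection rule (uniform sampling vs. UCB1) and the backup rule (averaging vs. Bellman maximization) are independent design choices that the paper mixes and matches.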
Feldman, Z., & Domshlak, C. (2014). On MABs and separation of concerns in Monte-Carlo planning for MDPs. In Proceedings of the Twenty-Fourth International Conference on Automated Planning and Scheduling (ICAPS 2014), pp. 120–127. AAAI Press. https://doi.org/10.1609/icaps.v24i1.13631