INSTITUTIONAL DIGITAL REPOSITORY

Ballooning Multi-Armed Bandits


dc.contributor.author Ghalme, G.
dc.contributor.author Dhamal, S.
dc.contributor.author Jain, S.
dc.contributor.author Gujar, S.
dc.contributor.author Narahari, Y.
dc.date.accessioned 2021-06-29T21:03:17Z
dc.date.available 2021-06-29T21:03:17Z
dc.date.issued 2021-06-30
dc.identifier.uri http://localhost:8080/xmlui/handle/123456789/1926
dc.description.abstract In this paper, we introduce ballooning multi-armed bandits (BL-MAB), a novel extension of the classical stochastic MAB model. In the BL-MAB model, the set of available arms grows (or balloons) over time. In contrast to the classical MAB setting, where the regret is computed with respect to the best arm overall, the regret in the BL-MAB setting is computed with respect to the best available arm at each time. We first observe that existing stochastic MAB algorithms result in linear regret for the BL-MAB model. We prove that, if the best arm is equally likely to arrive at any time instant, sub-linear regret cannot be achieved. Next, we show that if the best arm is more likely to arrive in the early rounds, one can achieve sub-linear regret. Our proposed algorithm determines (1) the fraction of the time horizon for which the newly arriving arms should be explored and (2) the sequence of arm pulls in the exploitation phase from among the explored arms. Making reasonable assumptions on the arrival distribution of the best arm in terms of the thinness of the distribution's tail, we prove that the proposed algorithm achieves sub-linear instance-independent regret. We further quantify the explicit dependence of the regret on the arrival distribution parameters. We reinforce our theoretical findings with extensive simulation results. We conclude by showing that our algorithm would achieve sub-linear regret even if (a) the distributional parameters are not exactly known, but are obtained using a reasonable learning mechanism, or (b) the best arm is not more likely to arrive early, but a large fraction of arms is likely to arrive relatively early. en_US
dc.language.iso en_US en_US
dc.title Ballooning Multi-Armed Bandits en_US
dc.type Article en_US
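
As a rough illustration of the explore-then-exploit structure described in the abstract, the following Python sketch simulates a ballooning arm set in which arms arrive over time and the learner explores newly arriving arms for a fraction of the horizon before exploiting the empirically best explored arm. The horizon T, exploration fraction alpha, arrival probability, and Bernoulli rewards are assumptions made for this toy example; this is not the paper's exact algorithm or its regret-optimal choice of the exploration fraction.

```python
# Toy sketch of a BL-MAB-style explore-then-exploit strategy.
# All parameter names and values here (T, alpha, arrival_prob) are
# illustrative assumptions, not the algorithm proposed in the paper.
import random


def simulate_bl_mab(T=10_000, alpha=0.3, arrival_prob=0.01, seed=0):
    rng = random.Random(seed)
    arms = []                  # true (unknown) Bernoulli means of arrived arms
    pulls, rewards = [], []    # per-arm pull counts and cumulative rewards
    total_reward = 0.0

    for t in range(1, T + 1):
        # The arm set balloons: a new arm may arrive at any round.
        if rng.random() < arrival_prob or not arms:
            arms.append(rng.random())
            pulls.append(0)
            rewards.append(0.0)

        if t <= alpha * T:
            # Exploration phase: pull the least-pulled arm so that
            # newly arriving arms get sampled.
            i = min(range(len(arms)), key=lambda j: pulls[j])
        else:
            # Exploitation phase: pull the arm with the highest
            # empirical mean among the explored arms.
            i = max(range(len(arms)),
                    key=lambda j: rewards[j] / pulls[j] if pulls[j] else 0.0)

        r = 1.0 if rng.random() < arms[i] else 0.0
        pulls[i] += 1
        rewards[i] += r
        total_reward += r

    return total_reward


if __name__ == "__main__":
    print("total reward:", simulate_bl_mab())
```

In the paper's setting, the exploration fraction is not a free constant as above but is derived from assumptions on the arrival distribution of the best arm; the sketch only conveys the two-phase structure mentioned in the abstract.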

