50 items in total
- [44] Scaling Multi-Armed Bandit Algorithms. KDD'19: Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2019: 1449-1459
- [50] Improving Strategies for the Multi-Armed Bandit. Markov Process and Control Theory, 1989, 54: 158-163