
Are Multi-Armed Bandits Susceptible to Peeking?

Jul 03, 2018
