Usage Examples

Bernoulli Bandits: a three-armed Bernoulli bandit simulation comparing epsilon-greedy, UCB1, and Thompson sampling strategies.