My wife and I were both terrible at probability at school. So, we pretty much gave up straight away on this one.
I wanted to know how the odds in a game of chance change (if at all) as more games are played.
As a simple example, let's say each discrete game is won with probability 1/2 (like the flip of a coin). I understand that the chance of winning each individual game is 1 in 2.
If player 1 plays 1 game and player 2 plays 100 games, surely player 2 has a better chance of winning at least 1 game.
But I can't figure out the math (see above - bad at math).
Can anyone sort this out for us? Thanks!
If the probability of winning one game is $p$, and the games are independent of each other, then when you play $n$ games the probability that you win none of them is $(1-p)^n$ - that's a probability of $1-p$ of losing each game, multiplied together. The probability that you win at least one game is then $1-(1-p)^n$, since winning at least one is the complement of losing all of them. So if $p = 0.5$ and $n = 100$, the probability you'll win at least one game is very close to 1 (it's 0.9 followed by another 30 or so 9s).
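If it helps to play with the numbers, here's a small sketch of that formula (the function name is just my choice):

```python
def p_at_least_one(p, n):
    """Probability of winning at least one of n independent games,
    each won with probability p: the complement of losing all n."""
    return 1 - (1 - p) ** n

print(p_at_least_one(0.5, 1))    # one coin flip: 0.5
print(p_at_least_one(0.5, 100))  # so close to 1 that floats round it to 1.0
```

Note that for $n = 100$ the losing probability $(0.5)^{100} \approx 7.9 \times 10^{-31}$ is far below double-precision resolution, so the result prints as exactly `1.0`.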
There are a lot of other standard things you can do to look at what happens in these games. For example, the probability that you win $k$ of the games is given by the binomial distribution - $P(X = k) = {n \choose k} p^k (1-p)^{n-k}$, where ${n \choose k} = \frac{n(n-1)(n-2)\ldots(n-k+1)}{k(k-1)(k-2)\ldots 1}$ is the binomial coefficient.
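The binomial probability is straightforward to compute directly; a minimal sketch using Python's built-in `math.comb` for the binomial coefficient:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k): choose which k of the n games are wins,
    times the probability of any one such win/loss pattern."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability of exactly 50 wins in 100 fair games - only about 8%,
# even though 50 is the single most likely outcome.
print(binom_pmf(50, 100, 0.5))
```

The probabilities over all $k$ from $0$ to $n$ sum to 1, which is a handy sanity check.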
More simply, you can look at the expected number of games you'll win - it's just $np$, i.e. on average you'll win the same fraction of the games as the probability of winning any one game. So if $n = 100$ and $p = 0.5$, you'll win about 50 of them, and allowing for typical variability (the standard deviation is $\sqrt{np(1-p)} = 5$ here), it's reasonable to expect you'll win somewhere around 40-60 of the games.
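Those two numbers are quick to check, and a short Monte Carlo run (a sketch, using Python's standard `random` module) shows simulated win counts clustering in that 40-60 band:

```python
import random
from math import sqrt

n, p = 100, 0.5
mean = n * p                  # expected number of wins: 50
sd = sqrt(n * p * (1 - p))    # standard deviation: 5
print(mean - 2 * sd, mean + 2 * sd)  # the "about 40 to 60" range

# Simulate many 100-game sessions and count wins in each.
random.seed(1)
sessions = [sum(random.random() < p for _ in range(n)) for _ in range(1000)]
in_band = sum(40 <= w <= 60 for w in sessions) / len(sessions)
print(in_band)  # the large majority of sessions land in 40-60
```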