Probability that I am better than my opponent, given $n$ wins out of $m$


So I'm assuming that I have some static probability ($p$) of winning a given game against my opponent. I am also assuming that the wins are independent of one another.

My intuition is that this can be calculated by taking the integral of the binomial distribution from $0.5$ to $1$, and dividing that by the integral of the binomial distribution from $0$ to $1$.

For example, given 33 wins in 58 games: $$P(p > 0.5 \mid 33\text{ wins}, 58\text{ games}) = {\int_{0.5}^1 {58 \choose 33}p^{33}(1-p)^{58-33}\,dp\over \int_0^1 {58 \choose 33}p^{33}(1-p)^{58-33}\,dp} \approx 0.851 $$
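This ratio is easy to check numerically. A minimal sketch using a plain midpoint rule (the function names here are illustrative, not from any particular library):

```python
from math import comb

def likelihood(p, w=33, n=58):
    # Binomial likelihood of w wins in n games at win probability p
    return comb(n, w) * p**w * (1 - p)**(n - w)

def integrate(f, lo, hi, steps=100_000):
    # Simple midpoint rule; plenty accurate for this smooth integrand
    h = (hi - lo) / steps
    return h * sum(f(lo + (i + 0.5) * h) for i in range(steps))

numerator = integrate(likelihood, 0.5, 1.0)
denominator = integrate(likelihood, 0.0, 1.0)
print(round(numerator / denominator, 3))  # ≈ 0.85
```

The binomial coefficient could be dropped from both integrals, since it cancels in the ratio.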

Is this correct? And suppose I have some prior belief about my edge (say, I think I have a 95% edge on this opponent): how would I plug these values into Bayes' theorem? I'm not sure what to put in the denominator in that case.

Finally, my opponent, whom I am taking a jab at by calculating this, will surely protest that the wins are dependent upon one another. And indeed they are: the numbers come from backgammon games played with a doubling cube. I assume that this makes the actual calculation practically impossible, but that if I did, the probability will be lower than the one calculated here. Is my intuition correct?

Best answer:

The probability you win a game is $\pi.$ This is a random variable with a beta prior distribution with known parameters $(\alpha,\beta).$ We observe the results of $n$ games, of which you win $w$ and lose $\ell$, where $n=w+\ell.$ These games are assumed to be independent Bernoulli outcomes given the value of $\pi.$ The posterior distribution of $\pi$ is still beta but with parameters $(\alpha+w,\beta+\ell).$

$$P(\pi >0.5)=\frac{\Gamma(\alpha+\beta +w+\ell)}{\Gamma(\alpha+w)\Gamma(\beta+\ell)} \int_{0.5}^1 p^{\alpha+w-1}(1-p)^{\beta+\ell-1}dp $$

In Excel this is $1-\text{BETA.DIST}(0.5,\ \alpha+w,\ \beta+\ell,\ \text{TRUE}).$

The values of $\alpha>0$ and $\beta>0$ represent our prior knowledge about $\pi.$ Typically they would be the results of prior matches, although they do not have to be integers. If $\alpha=\beta$ then we have evenly matched players. If the common value is $1$ then $\pi$ has a uniform prior. If the common value is, say, $100$ then the prior distribution is still symmetric about $0.5$ but with much less spread. A small value of $\alpha+\beta$ represents uncertain information about $\pi$ since just a few wins and losses will make the posterior distribution essentially independent of the prior. To help choose $\alpha, \beta:$

$$E(\pi)=\frac {\alpha}{\alpha+\beta},\qquad \operatorname{Var}(\pi)=\frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}$$

$\alpha=25, \beta=15$ and $\alpha=180, \beta=150$ are examples of prior distributions with $P(\pi > 0.5)=0.95$, although I wouldn't refer to this as a "95% edge." Perhaps $E(\pi)=0.95$ is closer to what you mean.
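These tail probabilities can be checked in a few lines. The sketch below assumes integer $\alpha,\beta$ and uses the standard identity relating the beta CDF to a binomial tail: for integer $a$, if $X \sim \text{Beta}(a,b)$ then $P(X \le x) = P(Y \ge a)$ where $Y \sim \text{Binomial}(a+b-1, x)$.

```python
from math import comb

def beta_tail_half(a, b):
    """P(X > 0.5) for X ~ Beta(a, b) with integer a, b.

    Via the identity P(X <= x) = P(Binomial(a+b-1, x) >= a),
    so P(X > 0.5) = P(Binomial(a+b-1, 0.5) <= a-1).
    """
    n = a + b - 1
    return sum(comb(n, k) for k in range(a)) / 2**n

# Both example priors give roughly a 95% prior probability that pi > 0.5
for a, b in [(25, 15), (180, 150)]:
    print(a, b, beta_tail_half(a, b))
```

The same function applied to the posterior parameters $(\alpha+w,\ \beta+\ell)$ gives $P(\pi>0.5)$ after the match results are folded in.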

Second answer:

This is the problem considered by the Reverend Thomas Bayes in the 18th century in his famous paper "An Essay towards solving a Problem in the Doctrine of Chances".

He supposed that the probability $\mathcal P$ of success on each trial is uniformly distributed between $0\text{ and }1$.

If successes on different trials are independent given the particular value of $\mathcal P$, regardless of which number between $0$ and $1$ that is, then they are not independent overall, for the following reason: your probability of success on the first trial is $1/2$, but if you succeed on the first $20$ trials, then it is probable that $\mathcal P$ is close to $1$, and hence probable that you will succeed on your $21$st trial. In that sense the event of success on the $21$st trial is not independent of the events of success on the first $20$ trials.

Suppose you have a prior probability distribution $f(p)\,dp$ (so that $f$ is the density function).

The likelihood function (modern terminology, not used by Bayes) is $L(p)={58\choose33}p^{33}(1-p)^{58-33}$. Bayes said that the posterior probability distribution is $c\, f(p) {58\choose33}p^{33}(1-p)^{58-33} \,dp$, where the normalizing constant $c$ is whatever number makes the integral equal to $1$.

In the problem you consider, you don't need either $c$ or ${58\choose33}$, since each appears in both the numerator and the denominator and cancels out.
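In code, the whole calculation can be carried out without ever computing $c$ or the binomial coefficient, since both cancel in the ratio. A sketch with a plain midpoint rule, where the prior density `f` is whatever you choose (the uniform prior shown reproduces the flat-prior calculation from the question):

```python
def posterior_above_half(f, w=33, n=58, steps=100_000):
    # Unnormalized posterior: prior density times p^w (1-p)^(n-w).
    # The constant c and the binomial coefficient cancel in the ratio.
    h = 1.0 / steps
    mids = [(i + 0.5) * h for i in range(steps)]
    vals = [f(p) * p**w * (1 - p) ** (n - w) for p in mids]
    total = sum(vals) * h
    upper = sum(v for p, v in zip(mids, vals) if p > 0.5) * h
    return upper / total

print(round(posterior_above_half(lambda p: 1.0), 3))  # uniform prior, ≈ 0.85
```

Swapping in a non-uniform prior, such as a beta density reflecting earlier matches, needs only a different `f`.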

Your bottom line of $0.851$ is plausible: since $33$ is more than half of $58$, you should end up with a number greater than $1/2$.