So I'm assuming that I have some static probability ($p$) of winning a given game against my opponent. I am also assuming that the wins are independent of one another.
My intuition is that this can be calculated by taking the integral of the binomial distribution from $0.5$ to $1$, and dividing that by the integral of the binomial distribution from $0$ to $1$.
For example: given 33 wins in 58 games:$$P(p > 0.5 \mid 33\text{ wins}, 58\text{ games}) = {\int_{0.5}^1 {58 \choose 33}p^{33}(1-p)^{58-33}\,dp\over \int_0^1 {58 \choose 33}p^{33}(1-p)^{58-33}\,dp} \approx 0.851 $$
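This ratio can be sanity-checked numerically; here is a quick sketch (using scipy, which is my choice of tool, not part of the problem). The integrand is proportional to a Beta$(34, 26)$ density, so the ratio should match that distribution's upper tail:

```python
from scipy.integrate import quad
from scipy.special import comb
from scipy.stats import beta

w, n = 33, 58  # wins and total games

# Ratio of the two integrals written above
likelihood = lambda p: comb(n, w) * p**w * (1 - p)**(n - w)
num, _ = quad(likelihood, 0.5, 1)
den, _ = quad(likelihood, 0, 1)
ratio = num / den

# Same number via the Beta(w+1, n-w+1) posterior (uniform prior)
posterior_tail = beta.sf(0.5, w + 1, n - w + 1)

print(ratio, posterior_tail)  # both ≈ 0.851
```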
Is this correct? And suppose I have some prior belief about my edge (say, I think I have a 95% edge over this opponent): how would I plug these values into Bayes' theorem? I'm not sure what to put in the denominator in that case.
Finally, my opponent, at whom I am taking a jab by calculating this, will surely protest that the wins are dependent on one another. And indeed they are: the numbers come from backgammon games played with a doubling cube. I assume that this makes the exact calculation practically intractable, but that if it could be done, the resulting probability would be lower than the one calculated here. Is my intuition correct?
The probability you win a game is $\pi.$ This is a random variable with a beta prior distribution with known parameters $(\alpha,\beta).$ We observe the results of $n$ games, of which you win $w$ and lose $\ell$, so $n=w+\ell.$ These games are assumed to be independent Bernoulli outcomes given the value of $\pi.$ The posterior distribution of $\pi$ is then also beta, but with parameters $(\alpha+w,\beta+\ell):$
$$P(\pi >0.5)=\frac{\Gamma(\alpha+\beta +w+\ell)}{\Gamma(\alpha+w)\Gamma(\beta+\ell)} \int_{0.5}^1 p^{\alpha+w-1}(1-p)^{\beta+\ell-1}dp $$
In Excel this is $1-\text{BETA.DIST}(0.5,\,\alpha+w,\,\beta+\ell,\,\text{TRUE}).$
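Outside Excel, the same tail probability is a single call in scipy. A sketch with illustrative values: $\alpha=\beta=1$ (the uniform prior) and the question's 33 wins and 25 losses:

```python
from scipy.stats import beta

a, b = 1, 1    # prior parameters alpha, beta (uniform prior here)
w, l = 33, 25  # observed wins and losses

# sf is the survival function 1 - CDF, matching 1 - BETA.DIST(..., TRUE)
p_better = beta.sf(0.5, a + w, b + l)
print(p_better)  # ≈ 0.851 for this prior and data
```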
The values of $\alpha>0$ and $\beta>0$ represent our prior knowledge about $\pi.$ Typically they would be the results of prior matches, although they do not have to be integers. If $\alpha=\beta$ then we have evenly matched players. If the common value is $1$ then $\pi$ has a uniform prior. If the common value is, say, $100$ then the prior distribution is still symmetric about $0.5$ but with much less spread. A small value of $\alpha+\beta$ represents uncertain information about $\pi$ since just a few wins and losses will make the posterior distribution essentially independent of the prior. To help choose $\alpha, \beta:$
$$E(\pi)=\frac{\alpha}{\alpha+\beta}, \qquad \operatorname{Var}(\pi)=\frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}$$
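If you have a target mean and variance in mind, these two equations can be inverted in closed form to recover $(\alpha,\beta).$ A small sketch (the helper name is mine):

```python
def beta_params(mean, var):
    """Solve E(pi) = a/(a+b) and Var(pi) = ab/((a+b)^2 (a+b+1)) for (a, b)."""
    s = mean * (1 - mean) / var - 1  # s = a + b
    return mean * s, (1 - mean) * s

# Example: mean 0.625 and variance 375/65600 recover a Beta(25, 15) prior
a, b = beta_params(0.625, 375 / 65600)
print(a, b)  # → 25.0 15.0
```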
$\alpha=25, \beta=15$ and $\alpha=180, \beta=150$ are examples of prior distributions with $P(\pi > 0.5)\approx 0.95,$ although I wouldn't refer to this as a "95% edge." Perhaps a 95% edge is better expressed as $E(\pi)=0.95.$
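Both example priors can be checked numerically with the same survival-function call as before (a sketch with scipy):

```python
from scipy.stats import beta

# Tail probability P(pi > 0.5) for the two example priors
tails = {(a, b): beta.sf(0.5, a, b) for (a, b) in [(25, 15), (180, 150)]}
print(tails)  # both values come out near 0.95
```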