Imagine we assume there are two different types of coins:
- Coin A: a fair coin, p(heads) = 0.5.
- Coin B: a coin biased toward heads, p(heads) = 0.7.
We then want to learn from samples which coin we are flipping. Assume a naive prior over the two coins, so we have a Beta distribution, $\beta_0(1,1)$.
We flip the coin and see heads. Since the probability that coin A generates heads is 0.5 and the probability that coin B generates heads is 0.7, we update our distribution as:
$$ \beta_1\!\left(1+\frac{0.5}{1.2},\; 1+\frac{0.7}{1.2}\right) \approx \beta_1(1.4167,\, 1.5833) $$
Is this the correct way to update the distribution or will it improperly bias the distribution in some way?
I don't understand how a beta distribution enters into it. A beta distribution is usually used in the context of an unknown probability lying anywhere in $[0,1]$. We have no such parameter here; all we have is an unknown binary choice between coins $A$ and $B$. The most natural prior in this case is one that assigns probability $1/2$ to each coin.

With that prior, the a priori probability of flipping heads with coin $A$ is $\frac12\cdot0.5=0.25$, and the a priori probability of flipping heads with coin $B$ is $\frac12\cdot0.7=0.35$. Since heads was flipped, the posterior probability that the coin is coin $A$ is $0.25/(0.25+0.35)=5/12$, and the posterior probability that the coin is coin $B$ is $0.35/(0.25+0.35)=7/12$.
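The two-hypothesis update above is easy to check numerically. A minimal sketch of the computation (the dictionary names are my own, not from the question):

```python
# Likelihood of heads under each hypothesis
p_heads = {"A": 0.5, "B": 0.7}

# Uniform prior: probability 1/2 on each coin
prior = {"A": 0.5, "B": 0.5}

# Observe heads: posterior is proportional to prior * likelihood
unnorm = {c: prior[c] * p_heads[c] for c in prior}   # {"A": 0.25, "B": 0.35}
total = sum(unnorm.values())                          # 0.60
posterior = {c: unnorm[c] / total for c in unnorm}

print(posterior["A"])  # 5/12 ≈ 0.4167
print(posterior["B"])  # 7/12 ≈ 0.5833
```

Note that the denominator $0.25+0.35=0.6$ is just the normalizing constant; the $1.2$ in the question's update comes from summing the raw likelihoods $0.5+0.7$ instead, which is not a Bayesian update over the two hypotheses.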