In my last question, I asked about a weighted coin with probability of heads $p$. I received an answer that if we flip the coin twice, we get:
$$\Pr(p=p_i \mid F_1,F_2)=\dfrac{\Pr( F_2 \mid p=p_i)\Pr(p=p_i \mid F_1) }{\displaystyle \sum_j \Pr( F_2 \mid p=p_j)\Pr(p=p_j \mid F_1)}$$
I've had trouble proving this result. The numerator becomes:
$$\dfrac{\Pr(F_1, p=p_i)\cdot \Pr(F_2, p=p_i)}{\Pr(F_1)\cdot\Pr(p=p_i)}$$
but aside from assuming $F_1$ and $p=p_i$ to be independent (which certainly isn't true), I can't simplify this expression any further. The denominator similarly won't budge.
How can I demonstrate the above identity to be true?
Just use Bayes' rule and the law of total probability (LOTP):
$$\begin{aligned} P(p=p_{i}\mid F_{2})&=\frac{P(F_{2}\mid p=p_{i})\cdot{P(p=p_{i})}}{P(F_{2})}\\ &=\frac{P(F_{2}\mid p=p_{i})\cdot{P(p=p_{i})}}{\sum_{j}{P(F_{2}\mid p=p_{j})P(p=p_{j})}} \end{aligned}$$
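As a quick numerical sanity check of this single-flip update, here is a minimal sketch. The candidate biases `p_vals` and the uniform prior are illustrative assumptions, not part of the original problem:

```python
# Single-flip Bayes update over a discrete set of candidate biases p_j,
# with the LOTP sum as the normalising denominator.
p_vals = [0.2, 0.5, 0.8]           # assumed candidate values p_j
prior = [1/3, 1/3, 1/3]            # assumed uniform prior P(p = p_j)

def update(dist, heads):
    """Posterior over p_vals after observing one flip."""
    likelihood = [p if heads else 1 - p for p in p_vals]
    unnorm = [l * d for l, d in zip(likelihood, dist)]
    z = sum(unnorm)                # LOTP: P(F) = sum_j P(F | p_j) P(p_j)
    return [u / z for u in unnorm]

post = update(prior, heads=True)
print(post)                        # posterior mass shifts toward larger p
```

After a single head, each posterior weight is proportional to $p_j$ itself, so the mass shifts toward the larger candidate biases.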
Since the result of the first flip is also known, conditioning everything in the above result additionally on $F_{1}$ gives
$$\begin{aligned} P(p=p_{i}\mid F_{1},F_{2})&=\frac{P(F_{2}\mid F_{1},p=p_{i})\cdot{P(p=p_{i}\mid F_{1})}}{P(F_{2}\mid F_{1})}\\ &=\frac{P(F_{2}\mid F_{1},p=p_{i})\cdot{P(p=p_{i}\mid F_{1})}}{\sum_{j}{P(F_{2}\mid F_{1},p=p_{j})P(p=p_{j}\mid F_{1})}} \end{aligned}$$
The flips are conditionally independent given the bias: once $p=p_{i}$ is fixed, $F_{1}$ carries no further information about $F_{2}$, so $P(F_{2}\mid F_{1},p=p_{i})=P(F_{2}\mid p=p_{i})$. (Note that $F_{1}$ and $F_{2}$ are *not* unconditionally independent when $p$ is unknown; the first flip does shift beliefs about $p$ and hence about $F_{2}$. Only the conditional independence given $p$ is needed here.) Substituting gives the desired result:
$$\begin{aligned} P(p=p_{i}\mid F_{1},F_{2})&=\frac{P(F_{2}\mid p=p_{i})\cdot{P(p=p_{i}\mid F_{1})}}{P(F_{2}\mid F_{1})}\\ &=\frac{P(F_{2}\mid p=p_{i})\cdot{P(p=p_{i}\mid F_{1})}}{\sum_{j}{P(F_{2}\mid p=p_{j})P(p=p_{j}\mid F_{1})}} \end{aligned}$$
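The identity can also be verified numerically: updating sequentially (condition on $F_1$, then on $F_2$) must give the same posterior as conditioning on both flips at once. A minimal sketch, assuming a discrete prior over a few illustrative candidate biases:

```python
# Sequential vs. joint Bayesian update for two coin flips.
# The candidate biases and uniform prior are illustrative assumptions.
p_vals = [0.2, 0.5, 0.8]
prior = [1/3, 1/3, 1/3]

def update(dist, heads):
    """One Bayes step: posterior over p_vals given a single flip."""
    like = [p if heads else 1 - p for p in p_vals]
    unnorm = [l * d for l, d in zip(like, dist)]
    z = sum(unnorm)                      # LOTP denominator
    return [u / z for u in unnorm]

# Sequential update: F1 = heads, then F2 = tails (the formula above).
seq = update(update(prior, heads=True), heads=False)

# Joint update: flips are conditionally independent given p, so the
# two-flip likelihood factorises as p * (1 - p).
like12 = [p * (1 - p) for p in p_vals]
unnorm = [l * pr for l, pr in zip(like12, prior)]
z = sum(unnorm)
joint = [u / z for u in unnorm]

assert all(abs(a - b) < 1e-12 for a, b in zip(seq, joint))
```

The assertion passes because the two normalising constants combine into exactly the joint LOTP sum, which is the content of the identity.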