Coin toss, conditional expectation martingale question


Suppose there are two identical-looking coins $A$ and $B$ whose probabilities of heads are $P(H\mid A)=a$ and $P(H\mid B)=b$ with $a\ne b$. One of the coins is selected uniformly at random and tossed infinitely often, and the outcome of every toss is observed. Let $X_n$ denote the updated (posterior) probability that the chosen coin is $A$ given the outcomes of the first $n$ tosses.

$\{X_n\}$ is a martingale if $E(X_n\mid X_1,\dots,X_{n-1})=X_{n-1}$. Show that $\{X_n\}$ is a martingale.
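For reference, here is a sketch of how $X_n$ evolves (assuming the uniform prior $X_0=\tfrac12$): conditioning on the outcome of the $n$th toss, Bayes' rule gives the recursion

```latex
% Bayes update for the posterior X_n = P(A \mid \text{first } n \text{ tosses})
X_n =
\begin{cases}
\dfrac{X_{n-1}\,a}{X_{n-1}\,a + (1-X_{n-1})\,b}, & \text{if the $n$th toss is heads},\\[2ex]
\dfrac{X_{n-1}\,(1-a)}{X_{n-1}\,(1-a) + (1-X_{n-1})\,(1-b)}, & \text{if the $n$th toss is tails}.
\end{cases}
```

Since the predictive probability of heads given the past is $X_{n-1}a+(1-X_{n-1})b$ (which is exactly the heads-case denominator), averaging the two cases collapses to $X_{n-1}a + X_{n-1}(1-a) = X_{n-1}$, which is one route to the martingale identity.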

My idea is to show that $\{X_n\}$ is a sequence that converges almost surely to $1$ or $0$.

I defined $Y_n=\mathbb{1}\{\text{the $n$th toss is heads}\}$. By the strong law of large numbers, $\frac{1}{n}\sum_{i=1}^{n}Y_i$ converges to $a$ or $b$, depending on which coin was chosen. Hence for every $\epsilon>0$ there exists $N$ such that for every $n>N$, $\frac{1}{n}\sum_{i=1}^{n}Y_i\in(a-\epsilon,a+\epsilon)$ or $(b-\epsilon,b+\epsilon)$.

I want to say that, depending on which neighborhood $\frac{1}{n}\sum_{i=1}^{n}Y_i$ falls into, $X_n$ converges to $1$ or $0$, and from that to derive directly that $E(X_n\mid X_1,\dots,X_{n-1})=X_{n-1}$ for all $n>N+1$.
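As a sanity check on that identity, here is a small Python sketch (the Bayes update rule and the parameter values $a=0.7$, $b=0.4$ are illustrative assumptions, not part of the problem statement) that computes the exact one-step conditional expectation of $X_n$ given $X_{n-1}$, and also simulates one path of the posterior:

```python
import random

def update(x, a, b, heads):
    """Bayes update of the posterior x = P(coin A) after observing one toss."""
    if heads:
        num, den = x * a, x * a + (1 - x) * b
    else:
        num, den = x * (1 - a), x * (1 - a) + (1 - x) * (1 - b)
    return num / den

def one_step_cond_exp(x, a, b):
    """Exact E[X_n | X_{n-1} = x]: average the two possible updates,
    weighted by the predictive probability of heads given x."""
    p_heads = x * a + (1 - x) * b
    return p_heads * update(x, a, b, True) + (1 - p_heads) * update(x, a, b, False)

a, b = 0.7, 0.4          # illustrative head probabilities (assumed, a != b)
print(one_step_cond_exp(0.37, a, b))   # ~0.37: the martingale identity

# Simulate one path where coin A was chosen: the posterior drifts toward 1.
random.seed(0)
x = 0.5                  # uniform prior P(A) = 1/2
for _ in range(2000):
    x = update(x, a, b, random.random() < a)   # tosses come from coin A
print(x)                 # close to 1 after many tosses
```

The conditional expectation collapses algebraically to $x$ for any starting value, so the numerical check passes exactly up to floating-point rounding, while the simulated path illustrates the almost-sure convergence of $X_n$ to $1$ or $0$.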

I'm having trouble proving the last part formally, though. Can anyone help me with that?

Edit: I see now that the definition of a martingale is supposed to hold for every $n$, not just from some $N$ onward, so I guess my solution is useless. Sorry if you wasted your time reading it.