Existence of a limiting random variable in a coin-toss model, and a martingale


In my probability theory lecture we discussed the model of a coin with a uniformly random bias: the probability $\theta \in (0,1)$ with which the coin shows heads is drawn uniformly at random, and the coin is then tossed $n$ times.
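As a quick sanity check, this two-stage model is easy to simulate; the sketch below (function and variable names are my own, not from the lecture) draws the bias uniformly and then tosses the coin:

```python
import random

def simulate_y_n(n, rng=random):
    """Simulate one run of the model: draw theta ~ Uniform(0,1),
    then count heads in n independent tosses of a theta-biased coin."""
    theta = rng.random()
    heads = sum(rng.random() < theta for _ in range(n))
    return theta, heads
```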

It was already shown that the probability that $Y_n$, the number of heads in the first $n$ coin flips, equals $k$ can be written as $$ P(Y_n=k)= \binom{n}{k}\int^1_0 \theta^k(1-\theta)^{n-k}\, d\theta.$$ So far so good, but now I have an exercise asking me to show that
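This formula can be evaluated in closed form: the integral is the Beta function $B(k+1,\,n-k+1)=\frac{k!\,(n-k)!}{(n+1)!}$, so $P(Y_n=k)=\frac{1}{n+1}$ for every $k$, i.e. $Y_n$ is uniform on $\{0,\dots,n\}$. A sketch of this check in exact rational arithmetic (the function name is mine):

```python
from fractions import Fraction
from math import comb, factorial

def p_y_n_equals_k(n, k):
    """Exact P(Y_n = k), using the Beta integral
    ∫_0^1 θ^k (1-θ)^(n-k) dθ = k!(n-k)!/(n+1)!."""
    integral = Fraction(factorial(k) * factorial(n - k), factorial(n + 1))
    return comb(n, k) * integral

# Every outcome k = 0..n is equally likely: P(Y_n = k) = 1/(n+1).
n = 10
assert all(p_y_n_equals_k(n, k) == Fraction(1, n + 1) for k in range(n + 1))
```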

1.) a random variable $Y$ exists such that $$\lim_{n\rightarrow\infty} \frac{Y_n}{n} = Y \text{ and } \mathbb{E}[Y]=\frac{1}{2},$$

2.) $$N_n^\theta:=\frac{(n+1)!}{Y_n!(n-Y_n)!}\theta^{Y_n}(1-\theta)^{n-Y_n}$$ defines a martingale for all $0<\theta<1$.
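For what it's worth, the one-step identity behind 2.) can be checked numerically before attempting a proof. Under the uniform prior, the predictive probability of heads given $Y_n=k$ heads in $n$ tosses is $(k+1)/(n+2)$ (Laplace's rule of succession); the sketch below (helper names are mine) verifies $\mathbb{E}[N_{n+1}^\theta \mid Y_n=k]=N_n^\theta$ in exact arithmetic:

```python
from fractions import Fraction
from math import factorial

def n_mart(n, k, theta):
    """N_n^θ evaluated at Y_n = k, in exact rational arithmetic."""
    return (Fraction(factorial(n + 1), factorial(k) * factorial(n - k))
            * theta**k * (1 - theta)**(n - k))

# One-step martingale check: given Y_n = k, the predictive probability of
# heads on toss n+1 is (k+1)/(n+2) (rule of succession, uniform prior), so
#   E[N_{n+1}^θ | Y_n = k]
#     = (k+1)/(n+2) · N_{n+1}(k+1) + (n+1-k)/(n+2) · N_{n+1}(k).
theta = Fraction(1, 3)
for n in range(1, 8):
    for k in range(n + 1):
        p_head = Fraction(k + 1, n + 2)
        cond_exp = (p_head * n_mart(n + 1, k + 1, theta)
                    + (1 - p_head) * n_mart(n + 1, k, theta))
        assert cond_exp == n_mart(n, k, theta)
```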

I don't know where to start with 1.), and for 2.) plugging the definition into the martingale property does not suggest an obvious route to the desired result.

Any help would be appreciated, thanks in advance!

  1. For any $n$, $$E[Y_n] = \sum_{k=0}^n k\,P(Y_n=k)= \sum_{k=0}^nk\,\binom{n}{k}\int^1_0 \theta^k(1-\theta)^{n-k}\, d\theta$$ $$=\int^1_0\sum_{k=0}^nk\,\binom{n}{k} \theta^k(1-\theta)^{n-k}\,d\theta =\int^1_0 n\theta\,d\theta ={1\over 2}n,$$ since the inner sum is the mean of a $\mathrm{Binomial}(n,\theta)$ distribution. So for any $n>0$, $$E\left[{Y_n\over n}\right]={1\over2}.$$ For the existence of the limit itself, condition on the bias: given $\theta$, the tosses are i.i.d. $\mathrm{Bernoulli}(\theta)$, so $Y_n/n \to \theta$ almost surely by the strong law of large numbers. Hence $Y=\theta$, which is uniform on $(0,1)$, and $\mathbb{E}[Y]=\int_0^1\theta\,d\theta=\frac12$.
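The expectation calculation can be confirmed in exact rational arithmetic; this sketch (the function name is mine) computes $E[Y_n]$ straight from the question's formula for $P(Y_n=k)$:

```python
from fractions import Fraction
from math import comb, factorial

def expected_y_n(n):
    """E[Y_n] computed exactly from P(Y_n = k) = C(n,k)·B(k+1, n-k+1),
    where the Beta integral equals k!(n-k)!/(n+1)!."""
    total = Fraction(0)
    for k in range(n + 1):
        p_k = comb(n, k) * Fraction(factorial(k) * factorial(n - k),
                                    factorial(n + 1))
        total += k * p_k
    return total

# E[Y_n / n] = 1/2 for every n >= 1, matching the calculation above.
assert all(expected_y_n(n) == Fraction(n, 2) for n in range(1, 12))
```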