In my probability theory lecture we discussed the model of a coin toss with a uniformly random coin: the probability $\theta \in (0,1)$ with which the coin shows heads is itself chosen uniformly at random. The coin is then tossed $n$ times.
It was already shown that the probability that $Y_n$, the number of heads in the first $n$ coin flips, equals $k$ is $$ P(Y_n=k)= \binom{n}{k}\int^1_0 \theta^k(1-\theta)^{n-k}\, d\theta.$$ So far so good, but now I have an exercise asking me to show that
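As a sanity check I evaluated this distribution exactly: the integral is the Beta function $B(k+1,n-k+1)=\frac{k!(n-k)!}{(n+1)!}$, so the formula can be computed with exact rational arithmetic (the helper name `p_yn` is mine):

```python
from fractions import Fraction
from math import comb, factorial

def p_yn(n, k):
    # P(Y_n = k) = C(n,k) * Integral_0^1 t^k (1-t)^(n-k) dt
    #            = C(n,k) * k!(n-k)! / (n+1)!   (Beta integral)
    return comb(n, k) * Fraction(factorial(k) * factorial(n - k), factorial(n + 1))

n = 10
print([p_yn(n, k) for k in range(n + 1)])
# every entry is 1/(n+1), i.e. Y_n is uniform on {0, ..., n}
```

Interestingly, every value of $k$ comes out as $\frac{1}{n+1}$, so $Y_n$ is uniformly distributed on $\{0,\dots,n\}$.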
1.) a random variable $Y$ exists such that $$\lim_{n\rightarrow\infty} \frac{Y_n}{n} = Y \text{ and } \mathbb{E}[Y]=\frac{1}{2},$$
2.) $$N_n^\theta:=\frac{(n+1)!}{Y_n!(n-Y_n)!}\theta^{Y_n}(1-\theta)^{n-Y_n}$$ defines a martingale for all $0<\theta<1$.
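To convince myself both claims are plausible, I checked them numerically with exact arithmetic: $\mathbb{E}[Y_n/n]$ should be $\frac{1}{2}$ for every $n$, and a martingale must at least have constant expectation, so $\mathbb{E}[N_n^\theta]$ should not depend on $n$. This is of course not a proof, only a consistency check (all helper names here are my own):

```python
from fractions import Fraction
from math import comb, factorial

def p_yn(n, k):
    # P(Y_n = k) = C(n,k) * k!(n-k)!/(n+1)!  (the Beta integral in closed form)
    return comb(n, k) * Fraction(factorial(k) * factorial(n - k), factorial(n + 1))

def mean_ratio(n):
    # E[Y_n / n] under the mixed coin model
    return sum(Fraction(k, n) * p_yn(n, k) for k in range(n + 1))

def mean_N(n, theta):
    # E[N_n^theta] with N_n^theta = (n+1)!/(Y_n!(n-Y_n)!) * theta^Y_n (1-theta)^(n-Y_n)
    return sum(
        p_yn(n, k)
        * Fraction(factorial(n + 1), factorial(k) * factorial(n - k))
        * theta**k * (1 - theta)**(n - k)
        for k in range(n + 1)
    )

theta = Fraction(1, 3)
print([mean_ratio(n) for n in (1, 5, 20)])     # all equal 1/2
print([mean_N(n, theta) for n in (1, 5, 20)])  # all equal 1
```

Both expectations are constant in $n$ (namely $\frac{1}{2}$ and $1$), which is consistent with the two claims, but I still don't see how to prove them.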
I don't know where to start with 1.), and for 2.) plugging the definition into the martingale property does not lead me to any obvious way of obtaining the desired result.
Any help would be appreciated, thanks in advance!