Let $U$ be a Uniform$[0,1]$ variate and, conditionally on $U$, let $\{X_{n}\}_{n\geq 1}$ be i.i.d. $\operatorname{Bern}(U)$ variates. Show that $E(U\mid\sigma(X_{1},\dots,X_{n}))\xrightarrow{a.s.} U$.
My attempt(s): First, I can immediately notice that $Y_{n}=E(U\mid\sigma(X_{1},\dots,X_{n}))$ is a Doob martingale that is uniformly bounded by $1$, hence bounded in $L^{\infty}$. So, by the martingale convergence theorem, $Y_{n}\xrightarrow{a.s.\,,\; L^{p},\,p<\infty} X$ for some $X\in L^{p}$ for all $p<\infty$. But I don't know how to show that $X$ is equal to $U$.
I also tried the method of moments, since $U$ is compactly supported: I wanted to show that $E(U^{m})=E(E(U^{m}\mid X_{1},\dots,X_{n}))\to E(X^{m})$. The issue is that the martingale involves $\big(E(U\mid X_{1},\dots,X_{n})\big)^{m}$ rather than $E(U^{m}\mid X_{1},\dots,X_{n})$, and we only have $\big(E(U\mid X_{1},\dots,X_{n})\big)^{m}\xrightarrow{L^{m}} X^{m}$. This does not give us that $X$ has the same moments as $U$.
I also know from martingale convergence theory that $Y_{n}=E(X\mid X_{1},\dots,X_{n})$, which would mean that $E(X\mid X_{1},\dots,X_{n})=E(U\mid X_{1},\dots,X_{n})$, i.e. $\int_{A} (X-U)\,dP=0$ for all $A\in \sigma(X_{1},\dots,X_{n})$ and all $n$. By a $\pi$-$\lambda$ argument this extends to all $A\in\mathcal F_{\infty}:=\sigma(X_{1},X_{2},\dots)$, so $X=E(U\mid\mathcal F_{\infty})$; but can we conclude from this that $X=U$? I am very unsure, since this argument uses no property of $X_{1},\dots,X_{n}$ beyond measurability, let alone the fact that, conditioned on $U$, they are i.i.d. Bernoulli$(U)$ variates.
Since the $X_{n}$ are i.i.d. conditionally on $U$ (unconditionally they are only exchangeable), I tried to think of using Kolmogorov's zero-one law, but I couldn't see how that could help.
Can anyone provide a hint or help me with this?
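As a quick numerical sanity check of the statement (not a proof), here is a short simulation I wrote: it draws $U$, generates conditionally i.i.d. Bernoulli$(U)$ samples, and approximates $E(U\mid X_{1},\dots,X_{n})$ by numerically integrating against the likelihood $u^{S_n}(1-u)^{n-S_n}$ on a grid (all function names are my own):

```python
import math
import random

def posterior_mean(xs, n_grid=2001):
    # Approximate E[U | X = xs] = (∫ u·L(u) du) / (∫ L(u) du) on a grid,
    # where L(u) = u^s (1-u)^(n-s) is the likelihood of the observed data.
    # Work with log-likelihoods and subtract the max to avoid underflow.
    n, s = len(xs), sum(xs)
    grid = [(k + 0.5) / n_grid for k in range(n_grid)]  # midpoints in (0, 1)
    logw = [s * math.log(u) + (n - s) * math.log(1 - u) for u in grid]
    m = max(logw)
    w = [math.exp(lw - m) for lw in logw]
    return sum(u * wi for u, wi in zip(grid, w)) / sum(w)

def experiment(n, seed=0):
    rng = random.Random(seed)
    u = rng.random()                                        # U ~ Uniform[0,1]
    xs = [1 if rng.random() < u else 0 for _ in range(n)]   # Bern(U) given U
    return u, posterior_mean(xs)
```

For moderately large $n$ the approximation of $E(U\mid X_{1},\dots,X_{n})$ lands close to the realized $U$, which is consistent with the claimed almost-sure convergence.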
The $\sigma$-algebra generated by $X_1,\dots,X_n$ is generated by a finite partition, hence in order to compute $\mathbb E\left[U\mid\sigma(X_1,\dots,X_n)\right]$ it suffices to determine, for each $(\delta_1,\dots,\delta_n)\in\{0,1\}^n$, the quantities $$\tag{*} \mathbb E\left[U\mathbf{1}_{(X_1,\dots,X_n)=(\delta_1,\dots,\delta_n)}\right]\quad\mbox{and}\quad\mathbb P\left((X_1,\dots,X_n)=(\delta_1,\dots,\delta_n)\right). $$ To do so, we condition with respect to $U$ and use the assumption; the result will be expressed in terms of $\ell(\delta)=\sum_{i=1}^n\delta_i$ and $n$. By assumption, $$ \mathbb E\left[\mathbf{1}_{(X_1,\dots,X_n)=(\delta_1,\dots,\delta_n)}\mid U\right]=U^{\ell(\delta)}(1-U)^{n-\ell(\delta)}. $$ Integrate this to get the second quantity in (*). For the first one, multiply by $U$, pull $U$ inside the conditional expectation, and take expectations. You will get that $$ \mathbb E\left[U\mid\sigma(X_1,\dots,X_n)\right]=\sum_{(\delta_i)_{i=1}^n\in\{0,1\}^n} c_{n,(\delta_i)_{i=1}^n}\mathbf{1}_{(X_i)_{i=1}^n=(\delta_i)_{i=1}^n}. $$
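Carrying the hinted computation through (a sketch, using the Beta integral $\int_0^1 u^a(1-u)^b\,du=\frac{a!\,b!}{(a+b+1)!}$ for nonnegative integers $a,b$): $$ \mathbb E\left[U\mathbf{1}_{(X_1,\dots,X_n)=(\delta_1,\dots,\delta_n)}\right]=\int_0^1 u^{\ell(\delta)+1}(1-u)^{n-\ell(\delta)}\,du=\frac{(\ell(\delta)+1)!\,(n-\ell(\delta))!}{(n+2)!}, $$ $$ \mathbb P\left((X_1,\dots,X_n)=(\delta_1,\dots,\delta_n)\right)=\int_0^1 u^{\ell(\delta)}(1-u)^{n-\ell(\delta)}\,du=\frac{\ell(\delta)!\,(n-\ell(\delta))!}{(n+1)!}, $$ so the coefficients are $c_{n,(\delta_i)_{i=1}^n}=\frac{\ell(\delta)+1}{n+2}$, i.e., writing $S_n=\sum_{i=1}^n X_i$, $$ \mathbb E\left[U\mid\sigma(X_1,\dots,X_n)\right]=\frac{S_n+1}{n+2}. $$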
In order to show that $\mathbb E\left[U\mid\sigma(X_1,\dots,X_n)\right]$ converges to $U$ almost surely, we use the following fact: if $\mathcal G$ is a $\sigma$-algebra and $U$ and $\mathbb E[U\mid\mathcal G]$ have the same distribution, then $U=\mathbb E[U\mid\mathcal G]$ almost surely.
Once the distribution of $\mathbb E\left[U\mid\sigma(X_1,\dots,X_n)\right]$ is determined, it suffices to check that it converges in distribution to a Uniform$[0,1]$ random variable.
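Concretely, carrying out the computation suggested above, $Y_n:=\mathbb E\left[U\mid\sigma(X_1,\dots,X_n)\right]$ takes the value $\frac{\ell+1}{n+2}$ on the event $\{S_n=\ell\}$, where $S_n=\sum_{i=1}^n X_i$, and $$ \mathbb P(S_n=\ell)=\binom{n}{\ell}\int_0^1 u^{\ell}(1-u)^{n-\ell}\,du=\binom{n}{\ell}\frac{\ell!\,(n-\ell)!}{(n+1)!}=\frac{1}{n+1},\qquad \ell=0,\dots,n, $$ so $Y_n$ is uniformly distributed on the grid $\left\{\frac{1}{n+2},\frac{2}{n+2},\dots,\frac{n+1}{n+2}\right\}$, which indeed converges in distribution to Uniform$[0,1]$ as $n\to\infty$.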