While studying the Weak Law of Large Numbers on Allan Gut's Graduate Probability course I came across the proof of Weierstrass approximation theorem.
Before the proof of uniform convergence he sets $X_1,...,X_n$ independent, identically distributed Bernoulli random variables with $P[X=1] = x$ and $P[X=0] = 1-x$. Finally he sets $Y_n=\frac{1}{n} \sum_{k=1}^{n} X_k$ and states that $Eu(Y_n)=u_n(x)$, where $u$ is the continuous function the theorem deals with and $u_n$ is the Bernstein polynomial of degree $n$.
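(For reference, the identity $Eu(Y_n)=u_n(x)$ can be checked directly: since $nY_n = \sum_{k=1}^{n} X_k \sim \mathrm{Bin}(n,x)$, $$ Eu(Y_n) = \sum_{k=0}^{n} u\!\left(\tfrac{k}{n}\right) \binom{n}{k} x^k (1-x)^{n-k}, $$ and the right-hand side is exactly the Bernstein polynomial of degree $n$ for $u$ evaluated at $x$.)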
The problem comes when he cites theorems 5.10.2 and 5.5.4 (which don't actually appear in the book, or at least I haven't found them) and states that $u(Y_n)\xrightarrow{\rm{P}} u(x)$ (which I understand) and that $u_n(x) = Eu(Y_n) \xrightarrow{} u(x)$.
Why is that last statement true?
I don't know what the author intended, but one possibility is the dominated convergence theorem. By the strong law of large numbers, $$ Y_n(\omega)\to x \qquad \text{for a.e. } \omega $$ (or if you prefer, the convergence is almost sure with respect to the probability measure). Since $u$ is continuous, one has $$ u\left( Y_n(\omega) \right) \to u(x) \qquad \text{for a.e. } \omega, $$ and $u$, being continuous on $[0,1]$ (I suppose - you should specify such things in your question), is also bounded, so the sequence $u\left( Y_n(\omega)\right)$ is dominated by a constant, which is $L^1$ with respect to any probability measure. Dominated convergence then yields $Eu(Y_n) \to u(x)$.
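As a numerical sanity check (my own illustration, not from Gut's book), here is a short Python sketch that evaluates the Bernstein polynomial $u_n(x) = Eu(Y_n)$ via the binomial expectation and watches the sup-error over $[0,1]$ shrink as $n$ grows; the test function $u = \cos$ is an arbitrary choice of continuous function on $[0,1]$.

```python
import numpy as np
from math import comb

def bernstein(u, n, x):
    # E[u(Y_n)] with n*Y_n ~ Binomial(n, x):
    # sum over k of u(k/n) * C(n, k) * x^k * (1-x)^(n-k)
    k = np.arange(n + 1)
    weights = np.array([comb(n, j) for j in range(n + 1)]) * x**k * (1.0 - x)**(n - k)
    return float(np.sum(u(k / n) * weights))

if __name__ == "__main__":
    u = np.cos  # any continuous function on [0, 1] works
    grid = np.linspace(0.0, 1.0, 101)
    for n in (10, 100, 1000):
        sup_err = max(abs(bernstein(u, n, x) - u(x)) for x in grid)
        print(f"n = {n:4d}   sup-error ≈ {sup_err:.2e}")
```

The error decays on the order of $1/n$ here (consistent with the classical rate $\|u_n - u\|_\infty \le C\,\omega_u(n^{-1/2})$ for the modulus of continuity, which is faster for smooth $u$), so uniform convergence is visible already for moderate $n$.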