Could anyone explain why this is a general case of Weierstrass Approximation?


Suppose $X_1, X_2, \ldots$ are independent Bernoulli random variables taking the values $1$ and $0$ with probabilities $p$ and $1-p$. Let $\bar{X}_n = \frac{1}{n} \sum\limits_{i=1}^n X_i$.

If $U \in C^0([0,1],\mathbb{R})$, then $E(U(\bar{X}_n))$ converges uniformly in $p$ to $U(E(\bar{X}_n)) = U(p)$.

Our professor presented this as a "general case of the Weierstrass approximation theorem" and proved it, but I still don't understand it very well.

What is the polynomial here?

What does it mean that $E(U(\bar{X}_n))$ converges uniformly to $U(E(\bar{X}_n)) = U(p)$?

Best answer:

The polynomial here is $E(U(\bar X_n))$, viewed as a function of $p$. Writing it out, this is $$ E(U(\bar X_n))=\sum_{k=0}^n U(k/n)\,P(\bar X_n=k/n)\tag1$$ since $\bar X_n$ takes values in $0/n, 1/n,\ldots, n/n$. If we write $$P(\bar X_n=k/n)=P(n\bar X_n=k)$$ we see that $n\bar X_n=\sum_i X_i$ is a Binomial($n,p$) random variable, so the sum (1) becomes the polynomial $$\sum_{k=0}^n U(k/n){n\choose k}p^k(1-p)^{n-k}.\tag2$$ Note that the coefficients $U(k/n)$ are constants; only the factors $p^k(1-p)^{n-k}$ depend on $p$, so (2) is indeed a polynomial in $p$ of degree at most $n$.

The convergence is uniform in $p$, in the sense that $$\sup_{p\in[0,1]} \left | E(U(\bar X_n))-U(p) \right | \to 0\ \text{as $n\to\infty$.} $$ The remark your professor made is that (2) is an explicit construction of the $n$th approximating polynomial (the Bernstein polynomial) in the Weierstrass theorem.
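To see the construction numerically, here is a short sketch (not from the original post) that evaluates the Bernstein polynomial (2) for a sample continuous function and estimates the sup-norm error over $[0,1]$ on a grid; the function names `bernstein_approx` and `sup_error` and the test function $U(x)=|x-\tfrac12|$ are my own choices for illustration.

```python
import math

def bernstein_approx(U, n, p):
    """Evaluate E[U(Xbar_n)] = sum_k U(k/n) * C(n,k) * p^k * (1-p)^(n-k),
    i.e. the n-th Bernstein polynomial of U at the point p."""
    return sum(U(k / n) * math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

def sup_error(U, n, grid=200):
    """Approximate sup_{p in [0,1]} |B_n U(p) - U(p)| on a uniform grid."""
    return max(abs(bernstein_approx(U, n, i / grid) - U(i / grid))
               for i in range(grid + 1))

# A continuous but non-smooth test function; convergence is slower
# near the kink at x = 1/2, yet still uniform.
U = lambda x: abs(x - 0.5)

for n in (10, 100, 500):
    print(n, sup_error(U, n))
```

Running this shows the sup-norm error shrinking as $n$ grows, which is the uniform convergence claimed in the theorem (the rate is only about $n^{-1/2}$ for this non-smooth $U$, consistent with the probabilistic proof via the variance of $\bar X_n$).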