Weierstrass Approximation generalization


By the standard Weierstrass theorem (in Bernstein's form), if $f:[0,1]\to \mathbb R$ is continuous then $$\sum_{j=0}^n f(j/n)\binom{n}{j}x^j(1-x)^{n-j}$$ converges to $f(x)$ uniformly for $0\le x\le 1$. I am wondering whether there is an elementary way to prove the following multivariate generalization: $$\sum_{0\le j_1,...,j_k\le n} f\left(\frac{j_1}{n},...,\frac{j_k}{n}\right)\prod_{i=1}^k\binom{n}{j_i}x_i^{j_i}(1-x_i)^{n-j_i} \to f(x_1,x_2,...,x_k) $$ uniformly. Note that the full generalization would have separate degrees $n_1,n_2,...,n_k$ rather than a single $n$, and the proof of that version gets complicated. Is there an easy proof for the case where all the binomial coefficients share the same $n$?
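(The claimed convergence is easy to check numerically. Here is a small Python sketch, with a test function of my own choosing, that evaluates the multivariate Bernstein sum above for $k=2$ and watches the pointwise error shrink as $n$ grows.)

```python
import math
import itertools

def bernstein_multivariate(f, n, x):
    """Evaluate the multivariate Bernstein polynomial B_n(f)(x)
    with the same degree n in every coordinate."""
    k = len(x)
    total = 0.0
    for js in itertools.product(range(n + 1), repeat=k):
        weight = 1.0
        for j, xi in zip(js, x):
            weight *= math.comb(n, j) * xi**j * (1 - xi)**(n - j)
        total += f(*(j / n for j in js)) * weight
    return total

f = lambda u, v: math.sin(u) * v**2   # arbitrary smooth test function
x = (0.3, 0.7)                        # arbitrary evaluation point
errs = [abs(bernstein_multivariate(f, n, x) - f(*x)) for n in (5, 20, 80)]
```

The error decays roughly like $1/n$ for smooth $f$, consistent with the $\frac{k}{4\delta^2 n}$ rate appearing in the answer below.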

Best answer

Let $f$ be continuous on $[0,1]^k$ (hence uniformly continuous). Define (for fixed $n$ and $x=(x_1,...,x_k)$) $$B_n(f)(x) = \sum_{0 \le j_1,j_2,...,j_k \le n} f(\frac{j_1}{n},...,\frac{j_k}{n}) \prod_{i=1}^k {n \choose j_i}x_i^{j_i}(1-x_i)^{n-j_i} $$

Take $S_{n,i}(x) = \sum_{j=1}^n X_{j,i}(x)$, where $X_{j,i}(x) \sim \mathcal B(1,x_i)$ (so that $\mathbb P(X_{j,i}(x) = 1) = x_i = 1-\mathbb P(X_{j,i}(x)=0)$) and the family $\{X_{j,i}(x)\}_{j \in \mathbb N,\, i \in \{1,...,k\}}$ is independent (for fixed $x$). Let $S_n(x) = (S_{n,1}(x),...,S_{n,k}(x))$. With this notation we can rewrite:

$$B_n(f)(x) = \mathbb E[f(\frac{S_{n}(x)}{n})]$$
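This identity can be sanity-checked numerically: a Monte Carlo average of $f(S_n(x)/n)$ over independent Bernoulli draws should match the exact sum. A sketch for $k=2$ (the test function, point, and sample size are my own choices):

```python
import math
import random

def bernstein_exact(f, n, x):
    # Exact double sum defining B_n(f)(x) for k = 2.
    total = 0.0
    for j1 in range(n + 1):
        for j2 in range(n + 1):
            w = (math.comb(n, j1) * x[0]**j1 * (1 - x[0])**(n - j1)
                 * math.comb(n, j2) * x[1]**j2 * (1 - x[1])**(n - j2))
            total += f(j1 / n, j2 / n) * w
    return total

def bernstein_mc(f, n, x, trials=100_000, seed=0):
    # Monte Carlo estimate of E[f(S_n(x)/n)], with S_{n,i} ~ Binomial(n, x_i)
    # realized as a sum of n independent Bernoulli(x_i) draws.
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(trials):
        s1 = sum(rng.random() < x[0] for _ in range(n))
        s2 = sum(rng.random() < x[1] for _ in range(n))
        acc += f(s1 / n, s2 / n)
    return acc / trials

f = lambda u, v: u * v + u**2
x = (0.4, 0.6)
exact = bernstein_exact(f, 10, x)
mc = bernstein_mc(f, 10, x)
```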

Now, we want to bound $$ |B_n(f)(x) - f(x)| = |\mathbb E[ f(\frac{S_{n}(x)}{n}) - f(x)]| \le \mathbb E[ |f(\frac{S_{n}(x)}{n}) - f(x)|]$$

Fix $\varepsilon > 0$. Insert $1 = 1_{\{||\frac{S_n(x)}{n} - x|| > \delta\}} + 1_{\{||\frac{S_n(x)}{n} - x || \le \delta\}}$ inside the expectation; the triangle inequality then gives:

$$ |B_n(f)(x) - f(x)| \le \mathbb E[|f(\frac{S_{n}(x)}{n}) - f(x)|1_{\{||\frac{S_n(x)}{n} - x|| > \delta\}}] + \mathbb E[|f(\frac{S_{n}(x)}{n}) - f(x)|1_{\{||\frac{S_n(x)}{n} - x|| \le \delta\}}] $$

Since $f$ is continuous on the compact set $[0,1]^k$, it is bounded, say by $M$. Moreover, by uniform continuity, for any $\varepsilon > 0$ we can find $\delta > 0$ such that $|f(x)-f(y)| < \varepsilon$ whenever $||x-y|| < \delta$ (and we take that $\delta$ above). This gives:

$$ |B_n(f)(x) - f(x)| \le 2M \mathbb P(||\frac{S_n(x)}{n} - x|| > \delta) + \varepsilon \mathbb P(||\frac{S_n(x)}{n} - x|| \le \delta) $$

We're almost done. The second probability is bounded by $1$. We only need to bound the first one (for fixed $x$ this is immediate from the weak law of large numbers, but we need a bound independent of $x$). Taking $||\cdot||$ to be the sup norm (any norm works up to a constant, by equivalence of norms on $\mathbb R^k$), $||\frac{S_n(x)}{n} - x|| > \delta$ forces $|\frac{S_{n,i}(x)}{n} - x_i| > \delta$ for some $i$, so:

$$ \mathbb P(||\frac{S_n(x)}{n} - x|| > \delta) \le \mathbb P( \{|\frac{S_{n,1}(x)}{n} - x_1| > \delta\} \cup ... \cup \{|\frac{S_{n,k}(x)}{n} - x_k| > \delta\}) \le \sum_{i=1}^k \mathbb P(|\frac{S_{n,i}(x)}{n} - x_i| > \delta)$$

Now, since $\mathbb E[\frac{S_{n,i}(x)}{n}] = \frac{nx_i}{n} = x_i$, we can apply the Bienaymé–Chebyshev inequality (using $Var(S_{n,i}(x)) = n x_i(1-x_i)$) to get:

$$ \mathbb P(|\frac{S_{n,i}(x)}{n} - x_i| > \delta) \le \frac{Var(S_{n,i}(x))}{n^2\delta^2} = \frac{x_i(1-x_i)}{\delta^2 n} \le \frac{1}{4\delta^2 n} $$
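This Chebyshev bound is uniform in $x_i$, which is the whole point. It can be verified against the exact binomial tail probability across all $x$; a sketch (the values $n=50$ and $\delta=0.1$ are arbitrary choices):

```python
import math

def tail_prob(n, x, delta):
    # Exact P(|S_n/n - x| > delta) for S_n ~ Binomial(n, x).
    return sum(math.comb(n, j) * x**j * (1 - x)**(n - j)
               for j in range(n + 1) if abs(j / n - x) > delta)

n, delta = 50, 0.1
bound = 1 / (4 * delta**2 * n)  # the uniform Chebyshev bound, here 0.5
# Scan x over a grid of (0, 1); the exact tail never exceeds the bound.
worst = max(tail_prob(n, x / 100, delta) for x in range(1, 100))
```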

Hence:

$$ |B_n(f)(x) - f(x)| \le 2M \frac{k}{4\delta^2 n} + \varepsilon, $$ a bound independent of $x \in [0,1]^k$. Letting $n \to \infty$ and then $\varepsilon \to 0$ gives uniform convergence.

Note that this proof is really the same as the probabilistic proof of the Bernstein approximation theorem for functions on $[0,1]$ (one-dimensional); only the notation changes from $| \cdot |$ to $|| \cdot ||$.

As for the case $(n_1,...,n_k) \to (\infty,...,\infty)$: for fixed $n=(n_1,...,n_k)$, write (again) $S_n(x) = (S_{n_1,1}(x),...,S_{n_k,k}(x))$ and $\frac{S_n(x)}{n} = (\frac{S_{n_1,1}(x)}{n_1}, ... ,\frac{S_{n_k,k}(x)}{n_k})$ (with $B_n(f)(x)$ defined analogously). Proceeding as before, we get:

$$ |B_n(f)(x) - f(x)| \le \mathbb E[|f(\frac{S_{n}(x)}{n}) - f(x)|1_{\{||\frac{S_n(x)}{n} - x|| > \delta\}}] + \mathbb E[|f(\frac{S_{n}(x)}{n}) - f(x)|1_{\{||\frac{S_n(x)}{n} - x|| \le \delta\}}] $$

Again, fix $\varepsilon >0$. For the first term we only use that $f$ is bounded by some $M$; for the second, uniform continuity supplies an appropriate $\delta>0$ (which we fix), so that:

$$ |B_n(f)(x) - f(x)| \le 2M \mathbb P(||\frac{S_n(x)}{n} - x|| > \delta) + \varepsilon $$

To bound the first term, proceed similarly:

$$ \mathbb P(||\frac{S_n(x)}{n} - x|| > \delta) \le \sum_{i=1}^k \mathbb P(|\frac{S_{n_i,i}(x)}{n_i} - x_i| > \delta) \le \sum_{i=1}^k \frac{1}{4\delta^2n_i} \le \frac{k}{4\delta^2} \cdot \frac{1}{\min(n_i : i \in \{1,...,k\})}$$

Hence the bound (independent of $x$):

$$ |B_n(f)(x) - f(x)| \le 2M \frac{k}{4\delta^2} \frac{1}{\min(n_i : i \in \{1,...,k\})} + \varepsilon $$

But as $(n_1,...,n_k) \to (\infty,...,\infty)$ we have $\min(n_i : i \in \{1,...,k\}) \to \infty$ as well, so the bound tends to $\varepsilon$, and uniform convergence follows.
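A quick numerical illustration of the mixed-degree bound: for the monomial $f(u,v) = u^2 v$ the operator factorizes over coordinates, and the classical identity $B_n(t^2)(x) = x^2 + \frac{x(1-x)}{n}$ makes the pointwise error exactly $\frac{x_1(1-x_1)x_2}{n_1}$, governed by the smaller degree. A sketch (degrees and evaluation point are my own choices):

```python
import math

def bernstein_mixed(f, ns, x):
    # Multivariate Bernstein polynomial with degree n_i in coordinate i (k = 2).
    n1, n2 = ns
    total = 0.0
    for j1 in range(n1 + 1):
        for j2 in range(n2 + 1):
            w = (math.comb(n1, j1) * x[0]**j1 * (1 - x[0])**(n1 - j1)
                 * math.comb(n2, j2) * x[1]**j2 * (1 - x[1])**(n2 - j2))
            total += f(j1 / n1, j2 / n2) * w
    return total

f = lambda u, v: u**2 * v
x = (0.5, 0.5)
# Error is x1(1-x1)x2/n1, so the small first degree dominates:
err_small_min = abs(bernstein_mixed(f, (4, 400), x) - f(*x))   # 0.25/4 * 0.5
err_big_min = abs(bernstein_mixed(f, (40, 400), x) - f(*x))    # 0.25/40 * 0.5
```

Raising $n_2$ alone would not help: the error only vanishes when $\min(n_1, n_2) \to \infty$, matching the bound above.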