By the standard Weierstrass theorem, if $f:[0,1]\to \mathbb R$ is continuous then $$\sum_{j=0}^n f(j/n)\binom{n}{j}x^j(1-x)^{n-j}$$ converges to $f(x)$ uniformly for $0\le x\le 1$. I am wondering whether there is an elementary way to prove an easier multivariate generalization of this, namely that $$\sum_{0\le j_1,...,j_k\le n} f\left(\frac{j_1}{n},...,\frac{j_k}{n}\right)\prod_{i=1}^k\binom{n}{j_i}x_i^{j_i}(1-x_i)^{n-j_i} \to f(x_1,x_2,...,x_k) $$ uniformly. Note that the full generalization would have separate degrees $n_1,n_2,...,n_k$ rather than a single $n$, and its proof gets very complicated. Is there an easy proof for the case where the same $n$ appears in all the binomial coefficients?
2026-02-23 13:39:24
Weierstrass Approximation generalization
90 Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There is 1 best solution below.
Let $f$ be continuous on $[0,1]^k$ (hence uniformly continuous). Define, for fixed $n$ and $x=(x_1,...,x_k)$, $$B_n(f)(x) = \sum_{0 \le j_1,j_2,...,j_k \le n} f\left(\frac{j_1}{n},...,\frac{j_k}{n}\right) \prod_{i=1}^k \binom{n}{j_i}x_i^{j_i}(1-x_i)^{n-j_i}.$$
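For concreteness, here is a small numerical sketch of $B_n(f)$ in the case $k=2$ (Python, standard library only; the function name `bernstein_2d` and the test function are my own choices, not part of the question):

```python
from math import comb

def bernstein_2d(f, n, x, y):
    """Evaluate B_n(f)(x, y) as the double sum over products of two
    binomial weights b_j(t) = C(n, j) t^j (1 - t)^(n - j)."""
    bx = [comb(n, j) * x**j * (1 - x)**(n - j) for j in range(n + 1)]
    by = [comb(n, j) * y**j * (1 - y)**(n - j) for j in range(n + 1)]
    return sum(f(j1 / n, j2 / n) * bx[j1] * by[j2]
               for j1 in range(n + 1)
               for j2 in range(n + 1))

f = lambda u, v: u * u * v + v * v  # any continuous test function
# the approximation error shrinks as n grows
print(abs(bernstein_2d(f, 10, 0.3, 0.7) - f(0.3, 0.7)))
print(abs(bernstein_2d(f, 80, 0.3, 0.7) - f(0.3, 0.7)))
```

Note that the weights in each coordinate sum to $1$ (binomial theorem), so constants are reproduced exactly.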
Take $S_{n,i}(x) = \sum_{j=1}^n X_{j,i}(x)$, where $X_{j,i}(x) \sim \mathcal B(1,x_i)$ (so that $\mathbb P(X_{j,i}(x) = 1) = x_i = 1-\mathbb P(X_{j,i}(x)=0)$) and the family $\{X_{j,i}(x)\}_{j \in \mathbb N,\, i \in \{1,...,k\}}$ is independent (for fixed $x$). Let $S_n(x) = (S_{n,1}(x),...,S_{n,k}(x))$. In this notation we can rewrite:
$$B_n(f)(x) = \mathbb E[f(\frac{S_{n}(x)}{n})]$$
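This identity can be checked numerically: sampling independent binomials $S_{n,i}(x)\sim\mathcal B(n,x_i)$ and averaging $f(S_n(x)/n)$ should reproduce the exact double sum. A quick sketch (Python with `numpy`; the variable names are my own):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)
n, (x1, x2) = 20, (0.3, 0.7)
f = lambda u, v: u * u + v

# Exact B_n(f)(x): the double sum over products of binomial pmfs.
b = lambda t, j: comb(n, j) * t**j * (1 - t)**(n - j)
exact = sum(f(j1 / n, j2 / n) * b(x1, j1) * b(x2, j2)
            for j1 in range(n + 1) for j2 in range(n + 1))

# Monte Carlo estimate of E[f(S_n(x)/n)], with S_{n,i} ~ Binomial(n, x_i)
# independent across coordinates.
s = rng.binomial(n, [x1, x2], size=(200_000, 2)) / n
mc = f(s[:, 0], s[:, 1]).mean()
print(exact, mc)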
Now, we want to bound $$ |B_n(f)(x) - f(x)| = |\mathbb E[ f(\frac{S_{n}(x)}{n}) - f(x)]| \le \mathbb E[ |f(\frac{S_{n}(x)}{n}) - f(x)|]$$
Fix $\varepsilon > 0$. Now, write $1 = 1_{\{||\frac{S_n(x)}{n} - x|| > \delta\}} + 1_{\{||\frac{S_n(x)}{n} - x || \le \delta\}}$ under expectation, getting (by triangle inequality):
$$ |B_n(f)(x) - f(x)| \le \mathbb E[|f(\frac{S_{n}(x)}{n}) - f(x)|1_{\{||\frac{S_n(x)}{n} - x|| > \delta\}}] + \mathbb E[|f(\frac{S_{n}(x)}{n}) - f(x)|1_{\{||\frac{S_n(x)}{n} - x|| \le \delta\}}] $$
Since $f$ is continuous on the compact set $[0,1]^k$, it is bounded, say by $M$. Moreover, by uniform continuity, for any $\varepsilon > 0$ we can find $\delta > 0$ such that $|f(x)-f(y)| < \varepsilon$ whenever $||x-y|| < \delta$ (and we take that $\delta$ above). Hence:
$$ |B_n(f)(x) - f(x)| \le 2M \mathbb P(||\frac{S_n(x)}{n} - x|| > \delta) + \varepsilon \mathbb P(||\frac{S_n(x)}{n} - x|| \le \delta) $$
We're almost done. The second probability is bounded by $1$. We only need to bound the first one (note that for fixed $x$ it tends to $0$ by the weak law of large numbers, but we need a bound independent of $x$). Note that:
$$ \mathbb P(||\frac{S_n(x)}{n} - x|| > \delta) \le \mathbb P( \{|\frac{S_{n,1}(x)}{n} - x_1| > \delta\} \cup ... \cup \{|\frac{S_{n,k}(x)}{n} - x_k| > \delta\}) \le \sum_{i=1}^k \mathbb P(|\frac{S_{n,i}(x)}{n} - x_i| > \delta)$$
Now, since $\mathbb E[\frac{S_{n,i}(x)}{n}] = \frac{nx_i}{n} = x_i$ and $Var(S_{n,i}(x)) = nx_i(1-x_i)$, we can apply the Bienaymé-Chebyshev inequality to get:
$$ \mathbb P(|\frac{S_{n,i}(x)}{n} - x_i| > \delta) \le \frac{Var(S_{n,i}(x))}{n^2\delta^2} = \frac{x_i(1-x_i)}{\delta^2 n} \le \frac{1}{4\delta^2 n} $$
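This Chebyshev bound can be sanity-checked against the exact binomial tail probability (Python, standard library only; `tail` is a name I introduce for this sketch):

```python
from math import comb

def tail(n, x, delta):
    """Exact P(|S_n/n - x| > delta) for S_n ~ Binomial(n, x)."""
    return sum(comb(n, j) * x**j * (1 - x)**(n - j)
               for j in range(n + 1)
               if abs(j / n - x) > delta)

n, delta = 50, 0.1
bound = 1 / (4 * delta**2 * n)  # Chebyshev bound, = 0.5 here
for x in (0.1, 0.5, 0.9):
    print(x, tail(n, x, delta), "<=", bound)
```

The bound is crude but, crucially, it does not depend on $x$.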
Hence:
$$ |B_n(f)(x) - f(x)| \le 2M \frac{k}{4\delta^2 n} + \varepsilon, $$ which is a bound independent of $x \in [0,1]^k$. Letting $n \to \infty$ and then $\varepsilon \to 0$ gives the uniform convergence.
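The uniformity can be illustrated by measuring the maximum error over a grid of points; the sup error shrinks as $n$ grows even for a non-smooth $f$. A sketch (Python with `numpy`; the grid setup is my own illustration, not part of the proof):

```python
import numpy as np
from math import comb

def bernstein_2d_grid(f, n, ts):
    """Evaluate B_n(f) on the grid ts x ts via matrix products:
    (W F W^T)[a, b] = B_n(f)(ts[a], ts[b])."""
    w = np.array([[comb(n, j) * t**j * (1 - t)**(n - j)
                   for j in range(n + 1)] for t in ts])
    F = np.array([[f(j1 / n, j2 / n) for j2 in range(n + 1)]
                  for j1 in range(n + 1)])
    return w @ F @ w.T

ts = np.linspace(0.0, 1.0, 21)
f = lambda u, v: abs(u - v)  # continuous but not differentiable
target = np.array([[f(x, y) for y in ts] for x in ts])
for n in (5, 20, 80):
    print(n, np.max(np.abs(bernstein_2d_grid(f, n, ts) - target)))
```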
Note that this proof is really the same as the probabilistic proof of Bernstein approximation for functions on $[0,1]$ (one-dimensional); only the notation changed from $| \cdot |$ to $|| \cdot ||$.
As for the part with $(n_1,...,n_k) \to (\infty,...,\infty)$: for fixed $n=(n_1,...,n_k)$ write (again) $S_n(x) = (S_{n_1,1}(x),...,S_{n_k,k}(x))$ and $\frac{S_n(x)}{n} = (\frac{S_{n_1,1}(x)}{n_1}, ... ,\frac{S_{n_k,k}(x)}{n_k})$ (with $B_n(f)(x)$ defined analogously). Trying to do the same, we get:
$$ |B_n(f)(x) - f(x)| \le \mathbb E[|f(\frac{S_{n}(x)}{n}) - f(x)|1_{\{||\frac{S_n(x)}{n} - x|| > \delta\}}] + \mathbb E[|f(\frac{S_{n}(x)}{n}) - f(x)|1_{\{||\frac{S_n(x)}{n} - x|| \le \delta\}}] $$
Again, fix $\varepsilon > 0$. In the first term we only use that $f$ is bounded by some $M$; in the second, uniform continuity provides an appropriate $\delta > 0$ (and we choose that $\delta$), so that:
$$ |B_n(f)(x) - f(x)| \le 2M \mathbb P(||\frac{S_n(x)}{n} - x|| > \delta) + \varepsilon $$
To bound the first term, proceed similarly:
$$ \mathbb P(||\frac{S_n(x)}{n} - x|| > \delta) \le \sum_{i=1}^k \mathbb P(|\frac{S_{n_i,i}(x)}{n_i} - x_i| > \delta) \le \sum_{i=1}^k \frac{1}{4\delta^2 n_i} \le \frac{k}{4\delta^2} \cdot \frac{1}{\min(n_i : i \in \{1,...,k\})}$$
Hence the bound (independent of $x$):
$$ |B_n(f)(x) - f(x)| \le 2M \frac{k}{4\delta^2} \frac{1}{\min(n_i : i \in \{1,...,k\})} + \varepsilon $$
But as $(n_1,...,n_k) \to (\infty,...,\infty)$ we get $\min(n_i : i \in \{1,...,k\}) \to \infty$ as well, so the bound tends to $\varepsilon$, which proves the uniform convergence.
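A sketch of this rectangular case with a separate degree per coordinate (Python, standard library; `bernstein_rect` is my own name), illustrating that the error is governed by $\min(n_1, n_2)$:

```python
from math import comb, cos

def bernstein_rect(f, n1, n2, x, y):
    """B_{(n1,n2)}(f)(x, y): a separate Bernstein degree per coordinate."""
    bx = [comb(n1, j) * x**j * (1 - x)**(n1 - j) for j in range(n1 + 1)]
    by = [comb(n2, j) * y**j * (1 - y)**(n2 - j) for j in range(n2 + 1)]
    return sum(f(j1 / n1, j2 / n2) * bx[j1] * by[j2]
               for j1 in range(n1 + 1)
               for j2 in range(n2 + 1))

f = lambda u, v: cos(u + v)
# Raising only the smaller degree is what shrinks the error.
print(abs(bernstein_rect(f, 60, 10, 0.4, 0.6) - f(0.4, 0.6)))
print(abs(bernstein_rect(f, 60, 80, 0.4, 0.6) - f(0.4, 0.6)))
```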