Showing that the Bernstein polynomials form a basis


Hello, I want to show that the Bernstein polynomials $$B_{n,k}=\binom{n}{k}x^k(1-x)^{n-k},\qquad k=0,\dots,n,$$ form a basis of the polynomials of degree at most $n$. For linear independence I got a hint from my teacher to expand the binomial $(1-x)^{n-k}$. This way I get: $$B_{n,k}=\binom{n}{k}x^k\sum_{j=0}^{n-k}\binom{n-k}{j}(-1)^jx^j$$ Shifting the index of summation by $k$ gives: $$B_{n,k}=\sum_{j=k}^{n}\binom{n-k}{j-k}\binom{n}{k}(-1)^{j-k}x^{j-k+k}=\sum_{j=k}^{n}\binom{n}{j}\binom{j}{k}(-1)^{j-k}x^j,$$ using the identity $\binom{n}{k}\binom{n-k}{j-k}=\binom{n}{j}\binom{j}{k}$. Now I have to show that all the $\alpha_i$ are $0$ in the relation $\sum_{i=0}^{n}\alpha_iB_{n,i}=0$, that is, $$\alpha_0\sum_{j=0}^n(-1)^j\binom{n}{j}\binom{j}{0}x^j+\alpha_1\sum_{j=1}^n(-1)^{j-1}\binom{n}{j}\binom{j}{1}x^j+\dots+\alpha_{n}\sum_{j=n}^n(-1)^{j-n}\binom{n}{j}\binom{j}{n}x^j=0.$$ Now what can I do, and how can I finish this problem? Thanks in advance!
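As a sanity check on the index shift, here is a short Python sketch (my own, not part of the hint; the helper names are made up) comparing the coefficients from the direct expansion with those from the re-indexed sum:

```python
from math import comb

def bernstein_coeffs_direct(n, k):
    """Monomial coefficients of B_{n,k}: expanding (1-x)^(n-k) directly,
    the coefficient of x^(k+i) is C(n,k)*C(n-k,i)*(-1)^i."""
    c = [0] * (n + 1)
    for i in range(n - k + 1):
        c[k + i] = comb(n, k) * comb(n - k, i) * (-1) ** i
    return c

def bernstein_coeffs_reindexed(n, k):
    """Same coefficients via the re-indexed sum: (-1)^(j-k)*C(n,j)*C(j,k) for j >= k."""
    return [(-1) ** (j - k) * comb(n, j) * comb(j, k) if j >= k else 0
            for j in range(n + 1)]

# The two formulas agree for every k, confirming the index change.
n = 5
for k in range(n + 1):
    assert bernstein_coeffs_direct(n, k) == bernstein_coeffs_reindexed(n, k)
```

For example, `bernstein_coeffs_direct(3, 0)` returns `[1, -3, 3, -1]`, the coefficients of $(1-x)^3$.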


4 Answers

Accepted Answer

When you expand them, you see that only one of the Bernstein polynomials has a non-zero constant term, namely $B_{n,0}(x)=(1-x)^n$. So, if$$\sum_{k=0}^n\alpha_kB_{n,k}(x)=0,\tag1$$then $\alpha_0=0$.

Now, there are only two Bernstein polynomials such that the coefficient of $x$ is non-zero, which are $B_{n,0}(x)$ and $B_{n,1}(x)$. But you already know that $\alpha_0=0$. It follows then from $(1)$ that $\alpha_1=0$.

And so on…
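This successive-elimination argument is easy to check numerically. The following Python sketch (helper name is my own) lists, for each power $x^j$, which Bernstein polynomials contribute a nonzero coefficient of $x^j$, and these turn out to be exactly the $B_{n,k}$ with $k \le j$:

```python
from math import comb

def bernstein_coeffs(n, k):
    """Monomial coefficients of B_{n,k}(x) = C(n,k) x^k (1-x)^(n-k)."""
    return [comb(n, k) * comb(n - k, j - k) * (-1) ** (j - k) if j >= k else 0
            for j in range(n + 1)]

n = 4
for j in range(n + 1):
    # Which Bernstein polynomials have a nonzero coefficient of x^j?
    nonzero = [k for k in range(n + 1) if bernstein_coeffs(n, k)[j] != 0]
    print(j, nonzero)  # exactly the k with k <= j
```

So the coefficient of $x^0$ forces $\alpha_0=0$, then the coefficient of $x^1$ forces $\alpha_1=0$, and so on.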

Answer

Hint: The matrix that expresses the Bernstein polynomials with respect to the canonical monomial basis is triangular with a diagonal of binomial coefficients, and so is invertible.

For instance, when $n=3$, we have $$ \begin{pmatrix} B_{3,0}(x) \\ B_{3,1}(x)\\ B_{3,2}(x) \\ B_{3,3}(x) \end{pmatrix} = \begin{pmatrix} (1-x)^3 \\ 3x(1-x)^2 \\ 3x^2(1-x) \\ x^3 \end{pmatrix} = \begin{pmatrix} 1 & -3 & \hphantom{-}3 & -1 \\ 0 & \hphantom{-}3 & -6 & \hphantom{-}3 \\ 0 & \hphantom{-}0 & \hphantom{-}3 & -3 \\ 0 & \hphantom{-}0 & \hphantom{-}0 & \hphantom{-}1 \\ \end{pmatrix} \begin{pmatrix} 1 \\ x \\ x^2 \\ x^3 \end{pmatrix} $$ The exact entries in the matrix are not important. The key point is that the $x^k$ factor in $B_{n,k}(x)$ ensures that in the $k$-th row all entries before the diagonal are zero, and so the matrix is triangular.
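A small Python sketch (helper name is my own) can rebuild the change-of-basis matrix from the binomial formula and confirm that it is upper triangular with diagonal entries $\binom{n}{k}$, hence invertible:

```python
from math import comb

def bernstein_matrix(n):
    """Row k holds the monomial coefficients of B_{n,k}(x)."""
    return [[comb(n, k) * comb(n - k, j - k) * (-1) ** (j - k) if j >= k else 0
             for j in range(n + 1)]
            for k in range(n + 1)]

M = bernstein_matrix(3)
# Upper triangular: every entry below/left of the diagonal is zero...
assert all(M[k][j] == 0 for k in range(4) for j in range(k))
# ...and the diagonal is C(3,k) = 1, 3, 3, 1, all nonzero, so det = 9 != 0.
assert [M[k][k] for k in range(4)] == [1, 3, 3, 1]
```

Since the determinant of a triangular matrix is the product of its diagonal, it is $\prod_{k=0}^n \binom{n}{k} \neq 0$.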

Answer

First of all, expand each Bernstein polynomial in the power (monomial) basis $1, t, t^2, \dots, t^n$.

Now suppose there exist constants $c_0, c_1, \dots, c_n$ such that the identity $0 = c_0B_{n,0}(t) + c_1B_{n,1}(t) + \dots + c_nB_{n,n}(t)$ holds for all $t$. Since the power basis is a linearly independent set, the coefficient of each power $t^j$ in this expansion must be zero, which implies that $c_0 = c_1 = \dots = c_n = 0$: $c_0$ is clearly zero (it is the only nonzero constant term), substituting this into the second equation gives $c_1 = 0$, substituting these two into the third equation gives $c_2 = 0$, and so on.

Answer

We can use induction and the derivative formula for the Bernstein polynomials, $B_{n,k}'=n\left(B_{n-1,k-1}-B_{n-1,k}\right)$, with the convention $B_{n-1,-1}=B_{n-1,n}=0$. More precisely: for $n=0$ the statement is true. For $n>0$, let $p=\sum_{k=0}^n \alpha_k B_{n,k}$ be the zero polynomial. Then $p'$ is also the zero polynomial, and by the derivative formula $p'=n\sum_{k=0}^{n-1}(\alpha_{k+1}-\alpha_k)B_{n-1,k}$, so the induction hypothesis gives $\alpha_k=\alpha_{k-1}$ for all $1\le k\le n$. But clearly $\alpha_0=p(0)=0$, so $\alpha_k=0$ for all $0\le k\le n$.
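The derivative identity used here can be checked on coefficient lists with a short Python sketch (helper names are my own):

```python
from math import comb

def bern(n, k):
    """Monomial coefficients of B_{n,k}; the zero polynomial if k is out of range."""
    if k < 0 or k > n:
        return [0] * (n + 1)
    return [comb(n, k) * comb(n - k, j - k) * (-1) ** (j - k) if j >= k else 0
            for j in range(n + 1)]

def deriv(c):
    """Coefficients of the derivative of the polynomial with coefficients c."""
    return [(j + 1) * c[j + 1] for j in range(len(c) - 1)]

# Verify B'_{n,k} = n * (B_{n-1,k-1} - B_{n-1,k}) coefficient by coefficient.
n = 5
for k in range(n + 1):
    lhs = deriv(bern(n, k))
    rhs = [n * (a - b) for a, b in zip(bern(n - 1, k - 1), bern(n - 1, k))]
    assert lhs == rhs
```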