The vanishing of the $\bf B$s


Define $B_0=1$ and recursively $$\tag 1 \sum_{k=0}^{n}\binom {n+1}k B_k=[n=0]$$

How can I prove $B_{2n+1}=0$ for $n\geqslant 1$ using this definition?

Note the above means that $$\sum_{k=0}^{n}\binom {n}k B_k=B_n+[n=1]$$

but of course we should use $(1)$ to recursively compute them.
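As a quick numerical sanity check (not part of a proof), recursion $(1)$ can be unwound with exact rational arithmetic; the sketch below is a minimal implementation, and the helper name `bernoulli` is mine:

```python
from fractions import Fraction
from math import comb

def bernoulli(N):
    """Compute B_0..B_N from sum_{k=0}^{n} C(n+1,k) B_k = [n==0]."""
    B = [Fraction(1)]                     # B_0 = 1 by definition
    for n in range(1, N + 1):
        s = sum(comb(n + 1, k) * B[k] for k in range(n))
        # For n >= 1 the right-hand side [n=0] is 0, and C(n+1,n) = n+1.
        B.append(-s / Fraction(n + 1))
    return B

B = bernoulli(12)
# B_1 = -1/2, B_2 = 1/6, and the odd-index values B_3, B_5, ..., B_11 are all 0.
```

Running this reproduces the familiar values and exhibits (but of course does not prove) the vanishing of the odd-index terms.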

There are 3 answers below.

Best answer:

Suppose we have $$ \sum_{k=0}^n\binom{n+1}{k}B_k=[n=0]\tag{1} $$ Summing $(1)$ times $(-1)^n\binom{m}{n+1}$ over $0\le n\le m$ yields $$ \begin{align} m &=\sum_{n=0}^m\sum_{k=0}^n(-1)^n\binom{m}{n+1}\binom{n+1}{k}B_k\\ &=\sum_{k=0}^m\sum_{n=k+1}^m(-1)^{n-1}\binom{m}{n}\binom{n}{k}B_k\\ &=\sum_{k=0}^m\sum_{n=k+1}^m(-1)^{n-1}\binom{m}{k}\binom{m-k}{n-k}B_k\\ &=\sum_{k=0}^{m-1}(-1)^k\binom{m}{k}B_k\tag{2} \end{align} $$ Setting $m=n+1$ in $(2)$ and subtracting $(1)$ gives $$ \begin{align} n+1-[n=0] &=\sum_{k=0}^n((-1)^k-1)\binom{n+1}{k}B_k\\ &=-2\sum_{k=0}^{\large\lfloor\frac{n-1}{2}\rfloor}\binom{n+1}{2k+1}B_{2k+1}\tag{3} \end{align} $$ Since $B_1=-\frac12$ (from $(1)$ with $n=1$), the $k=0$ term on the right of $(3)$ equals $n+1$. Subtracting it from both sides and substituting $n\mapsto2n+1$ results in $$ \sum_{k=1}^n\binom{2n+2}{2k+1}B_{2k+1}=0\tag{4} $$ which inductively shows that $B_{2k+1}=0$ for $k\ge1$.
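Identities $(2)$ and $(4)$ are easy to sanity-check with exact rational arithmetic; the sketch below (the function name `bernoulli` is mine) computes the $B_k$ from $(1)$ and verifies both for small ranges:

```python
from fractions import Fraction
from math import comb

def bernoulli(N):
    # B_0..B_N from the recursion sum_{k=0}^{n} C(n+1,k) B_k = [n==0]
    B = [Fraction(1)]
    for n in range(1, N + 1):
        B.append(-sum(comb(n + 1, k) * B[k] for k in range(n)) / Fraction(n + 1))
    return B

B = bernoulli(12)

# Identity (2): sum_{k=0}^{m-1} (-1)^k C(m,k) B_k = m
for m in range(1, 13):
    assert sum((-1) ** k * comb(m, k) * B[k] for k in range(m)) == m

# Identity (4): sum_{k=1}^{n} C(2n+2, 2k+1) B_{2k+1} = 0
for n in range(1, 6):
    assert sum(comb(2 * n + 2, 2 * k + 1) * B[2 * k + 1]
               for k in range(1, n + 1)) == 0
```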

Another answer:

Let $n$ be an odd number. For $m=1,\dots,n$, identity $(1)$ says $$\sum_{k=0}^m\binom {m+1}k B_k=0;$$ moving the $k=0$ term to the right-hand side gives us $n$ linear equations:

$$\begin{align} \sum\limits_{k = 1}^n \binom{n+1}k{B_k} &= - 1 \cr \sum\limits_{k = 1}^{n - 1} \binom nk {B_k} &= - 1 \cr \sum\limits_{k = 1}^{n - 2} \binom{n-1}k {B_k} &= - 1 \cr \cdots &= \cdots \cr \sum\limits_{k = 1}^2 \binom 3k {B_k} &= - 1\cr \sum\limits_{k = 1}^1 \binom 2k {B_k} &= - 1 \end{align} $$

Courtesy of Robjohn for the matrix formulation.

In matrix form, this system of linear equations is $$\textstyle \begin{bmatrix} \binom{n+1}{n}&\cdots&\binom{n+1}{3}&\binom{n+1}{2}&\binom{n+1}{1}\\ \vdots&\ddots&\vdots&\vdots&\vdots\\ 0&\cdots&\binom43&\binom42&\binom41\\ 0&\cdots&0&\binom32&\binom31\\ 0&\cdots&0&0&\binom21 \end{bmatrix} \begin{bmatrix} \vphantom{\binom11}B_n\\ \vdots\\ \vphantom{\binom11}B_3\\ \vphantom{\binom11}B_2\\ \vphantom{\binom11}B_1 \end{bmatrix} = \begin{bmatrix} \vphantom{\binom11}-1\\ \vdots\\ \vphantom{\binom11}-1\\ \vphantom{\binom11}-1\\ \vphantom{\binom11}-1 \end{bmatrix}$$

Now, applying Cramer's rule to solve for $B_n$ gives $B_n=\frac{\det A_n}{\det D_n}$, where $D_n$ is the matrix above and $A_n$ is the same as $D_n$ with its first column replaced by $(-1, -1,\dots,-1)^t$. Doing elementary row operations, you will get $\det A_n =0$.

I think that $(1)$ makes it easier to calculate $B_n$ as a determinant; however, $(2)$ also gives a matrix of the same type. If you just check, you will see that after eliminating the $-1$'s in the first column, $A_n$ automatically becomes an upper triangular matrix with one diagonal element equal to $0$. So the determinant vanishes.
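This determinant argument can be checked with exact fractions; the sketch below (the helper `det`, a plain Gaussian-elimination determinant, is mine) builds $D_n$ and $A_n$ for $n=7$ and confirms that $\det A_n=0$ while $\det D_n\ne0$:

```python
from fractions import Fraction
from math import comb

def det(M):
    """Determinant via exact Gaussian elimination over the rationals."""
    M = [[Fraction(x) for x in row] for row in M]
    size, sign, d = len(M), 1, Fraction(1)
    for c in range(size):
        p = next((r for r in range(c, size) if M[r][c] != 0), None)
        if p is None:
            return Fraction(0)          # zero column => singular
        if p != c:
            M[c], M[p] = M[p], M[c]     # row swap flips the sign
            sign = -sign
        d *= M[c][c]
        for r in range(c + 1, size):
            f = M[r][c] / M[c][c]
            for j in range(c, size):
                M[r][j] -= f * M[c][j]
    return sign * d

n = 7  # any odd n >= 3 works the same way
# D: rows are the equations for m = n+1, n, ..., 2; columns are B_n, ..., B_1
D = [[comb(m, k) if k <= m - 1 else 0
      for k in range(n, 0, -1)] for m in range(n + 1, 1, -1)]
# A: replace the first column (the coefficients of B_n) by the right-hand side
A = [[-1] + row[1:] for row in D]

# Cramer's rule: B_n = det(A) / det(D), and det(A) = 0 here.
```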

Another answer:

After RobJohn's solution no further answer is needed; but I have now seen the source of the misunderstanding in my previous answer (which has several comments, so I hesitate to retract/delete it; please leave a comment if you prefer that answer to be deleted).
Here is the improved/corrected matrix notation which I wanted to arrive at.


The recursive binomial identity $(1)$ in the OP is $$ \bf P^* \cdot \bf B = \bf I_0 $$ or in explicit matrix notation $$ \small \begin{bmatrix} 1 & . & . & . & . & . & . & . \\ 1 & 2 & . & . & . & . & . & . \\ 1 & 3 & 3 & . & . & . & . & . \\ 1 & 4 & 6 & 4 & . & . & . & . \\ 1 & 5 & 10 & 10 & 5 & . & . & . \\ 1 & 6 & 15 & 20 & 15 & 6 & . & . \\ 1 & 7 & 21 & 35 & 35 & 21 & 7 & . \\ 1 & 8 & 28 & 56 & 70 & 56 & 28 & 8 \end{bmatrix} \cdot \begin{bmatrix} 1\\b_1\\b_2\\b_3\\b_4\\b_5\\b_6\\b_7 \end{bmatrix} = \begin{bmatrix} 1\\0\\0\\0\\0\\0\\0\\0 \end{bmatrix}$$ which can be thought of as continued to any dimension. Here the $b_k$ are unknowns to be determined, and $b_0=1$ was given by definition. Because $\bf P^*$ is triangular with nonzero diagonal elements, it is invertible, so there is a unique solution for $ \bf B$; this uniqueness is an ingredient of the conjecture to be proved.
Now, inverting, we get $ {\bf G} = {\bf P^*} ^{-1}$: $$ {\bf G} =\small \begin{bmatrix} 1 & . & . & . & . & . & . & . \\ -1/2 & 1/2 & . & . & . & . & . & . \\ 1/6 & -1/2 & 1/3 & . & . & . & . & . \\ 0 & 1/4 & -1/2 & 1/4 & . & . & . & . \\ -1/30 & 0 & 1/3 & -1/2 & 1/5 & . & . & . \\ 0 & -1/12 & 0 & 5/12 & -1/2 & 1/6 & . & . \\ 1/42 & 0 & -1/6 & 0 & 1/2 & -1/2 & 1/7 & . \\ 0 & 1/12 & 0 & -7/24 & 0 & 7/12 & -1/2 & 1/8 \end{bmatrix}$$ and by $$ {\bf B } = {\bf G } \cdot {\bf I}_0$$ our result for $ \bf B $ is the first column of $\bf G$, where indeed the entries at odd indexes $ \gt 2$ vanish (and, the solution being unique, this answers the question in part).

Well, this does not suffice as a proof; a complete proof must still show that the entries at odd indexes $\gt 2$ indeed vanish in general. But it at least gives the heuristic insight from which a proof could then be derived.
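The heuristic above is easy to reproduce with exact rational arithmetic; the sketch below (all names are mine) builds $\bf P^*$, solves ${\bf P^*}\cdot{\bf B}={\bf I}_0$ by forward substitution, and confirms that the resulting first column of $\bf G$ has vanishing entries at the odd indexes $\gt 2$:

```python
from fractions import Fraction
from math import comb

N = 8
# P*: lower triangular, entry (n, k) = C(n+1, k) for k <= n, else 0
P = [[comb(n + 1, k) if k <= n else 0 for k in range(N)] for n in range(N)]

# Solve P* b = I_0 = (1, 0, ..., 0)^t by forward substitution;
# b is then the first column of G = (P*)^{-1}.
b = []
for n in range(N):
    rhs = Fraction(1 if n == 0 else 0)
    b.append((rhs - sum(P[n][k] * b[k] for k in range(n))) / P[n][n])

# b should read 1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0 -- the first column of G.
```

Since $\bf P^*$ is triangular with nonzero diagonal, the forward substitution never divides by zero, mirroring the uniqueness argument in the text.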