Forward difference of Bernoulli polynomials


I am working on a research project on Bernoulli polynomials, which are defined as $$B_n(x) = \sum_{j=0}^{n} \binom{n}{j}B_j x^{n-j},$$ where the $B_j$ are the Bernoulli numbers. There is also the generating function $$\frac{ze^{zx}}{e^z-1} = \sum_{n\ge 0} B_n(x)\frac{z^n}{n!}.$$ Many papers I have found cite the following property and use it extensively in their proofs: $$\Delta B_n(x) = n x^{n-1},$$ where $\Delta$ is the forward-difference operator from discrete calculus, $\Delta f(x) = f(x+1) - f(x)$. However, this fact is always stated without proof, and I am wondering how it is justified.
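For what it's worth, the identity is easy to confirm numerically from the definition. Below is my own sanity-check sketch in plain Python with exact rational arithmetic; the Bernoulli numbers are generated from the standard recurrence $\sum_{k=0}^{n}\binom{n+1}{k}B_k=\delta_{n,0}$ (function names are mine, not from any paper):

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(N):
    # B_0 .. B_N from the recurrence sum_{k=0}^{n} C(n+1, k) B_k = [n == 0]
    B = [Fraction(1)]
    for n in range(1, N + 1):
        B.append(-sum(comb(n + 1, k) * B[k] for k in range(n)) / (n + 1))
    return B

def bernoulli_poly(n, x, B):
    # B_n(x) = sum_{j=0}^{n} C(n, j) B_j x^{n-j}
    return sum(comb(n, j) * B[j] * x ** (n - j) for j in range(n + 1))

B = bernoulli_numbers(8)

# forward difference: B_n(x+1) - B_n(x) should equal n x^{n-1}
for n in range(1, 9):
    for x in (Fraction(0), Fraction(1, 2), Fraction(3), Fraction(-2)):
        assert bernoulli_poly(n, x + 1, B) - bernoulli_poly(n, x, B) == n * x ** (n - 1)
```

This checks the claimed identity exactly (no floating-point error) at several rational points, which for polynomials of bounded degree is already conclusive.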

Three answers are given below.

Answer 1

One way is to use the generating function: \begin{align} \sum_{n\ge 0} \Delta B_n(x) \frac{z^n}{n!} &= \sum_{n\ge 0}(B_n(x+1) - B_n(x)) \frac{z^n}{n!} \\ &= \frac{z e^{z(x+1)}}{e^z-1} - \frac{z e^{zx}}{e^z-1} \\ &= z e^{zx} \\ &= \sum_{m\ge 0} x^m \frac{z^{m+1}}{m!} \\ &= \sum_{n\ge 1} x^{n-1} \frac{z^n}{(n-1)!}. \end{align} Comparing the coefficients of $z^n/n!$ on both sides gives $\Delta B_n(x) = \frac{n!}{(n-1)!}\,x^{n-1} = n x^{n-1}$ for $n\ge 1$ (and $\Delta B_0(x) = 0$).
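The same coefficient comparison can be mirrored numerically: divide the truncated power series of $ze^{zx}$ by that of $e^z-1$ and compare the quotient's coefficients with $B_n(x)/n!$. A sketch in plain Python with exact rationals (the test point $x = 1/3$ and all names are my own choices):

```python
from fractions import Fraction
from math import comb, factorial

N = 8
x = Fraction(1, 3)  # arbitrary rational test point

# z e^{zx} and e^z - 1 both vanish at z = 0; cancel one factor of z and
# work with A(z) = e^{zx} and Bden(z) = (e^z - 1)/z, so that Bden(0) = 1.
A = [x ** k / factorial(k) for k in range(N + 1)]
Bden = [Fraction(1, factorial(k + 1)) for k in range(N + 1)]

# power-series long division: q(z) = A(z)/Bden(z); q[n] should be B_n(x)/n!
q = []
for k in range(N + 1):
    q.append(A[k] - sum(q[j] * Bden[k - j] for j in range(k)))

# independent computation of B_n(x) from the recurrence definition
Bnum = [Fraction(1)]
for n in range(1, N + 1):
    Bnum.append(-sum(comb(n + 1, k) * Bnum[k] for k in range(n)) / (n + 1))

def poly(n):
    return sum(comb(n, j) * Bnum[j] * x ** (n - j) for j in range(n + 1))

assert all(q[n] == poly(n) / factorial(n) for n in range(N + 1))
```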

Answer 2

You can think of the space of polynomials as the vector space generated by the monomials $1$, $x$, $x^2$, $x^3$, $\ldots$. The Bernoulli polynomials are a different basis for the same vector space and can be defined by the property that the linear map $L$ that transforms each monomial into the corresponding Bernoulli polynomial, $$ Lx^n=B_n(x), $$ transforms the forward difference of a polynomial into the derivative of that polynomial: $$ L\Delta f(x)=\frac{d}{dx}f(x). $$ (Writing down the system of linear equations relating the forward difference of $f$ to the derivative of $f$ and then solving that system is one way to obtain $L$ and hence the Bernoulli polynomials. Your summation formula for the Bernoulli polynomials and the recurrence for the Bernoulli numbers, $\sum_{k=0}^n\binom{n+1}{k}B_k=\delta_{n,0}$, naturally emerge from the solution.)

Now the property you want to prove has $L$ and $\Delta$ in the opposite order: you want to prove $$ \Delta B_n(x)=\Delta L x^n=\frac{d}{dx} x^n $$ or, more generally, $\Delta L f(x)=\frac{d}{dx}f(x)$.

In fact, the three operators $L$, $\Delta$, and $\frac{d}{dx}$, which all act on the space of polynomials, are mutually commuting, $$ \frac{d}{dx}\Delta f(x)=\Delta\frac{d}{dx}f(x),\quad \frac{d}{dx}Lf(x)=L\frac{d}{dx}f(x),\quad \Delta Lf(x)=L\Delta f(x). $$ The first of these follows from the chain rule; the second can be checked by verifying from your definition that $B'_n(x)=nB_{n-1}(x)$, which is the rule in the case $f(x)=x^n$, and using linearity of $L$ and $\frac{d}{dx}$. The third one, which is the one you want to prove, then follows from the second: apply $L\Delta f(x)=\frac{d}{dx}f(x)$ to the polynomial $Lf(x)$ to obtain $L\Delta Lf(x)=\frac{d}{dx}Lf(x)=L\frac{d}{dx}f(x)$. Since $L$ is an invertible operator, it follows that $\Delta Lf(x)=\frac{d}{dx}f(x)$.
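The third commutation relation $\Delta L f = L \Delta f$ is easy to check numerically. Here is my own sketch, with $L$ implemented directly as "replace $x^i$ by $B_i(x)$" and an example polynomial of my choosing:

```python
from fractions import Fraction
from math import comb

# Bernoulli numbers from the recurrence, then B_n(x) by the summation formula
Bnum = [Fraction(1)]
for n in range(1, 8):
    Bnum.append(-sum(comb(n + 1, k) * Bnum[k] for k in range(n)) / (n + 1))

def Bpoly(n, x):
    return sum(comb(n, j) * Bnum[j] * x ** (n - j) for j in range(n + 1))

def L(coeffs, x):
    # the operator L: replace each power x^i by B_i(x) (coeffs low-degree first)
    return sum(c * Bpoly(i, x) for i, c in enumerate(coeffs))

def delta_coeffs(coeffs):
    # coefficients of f(x+1) - f(x), expanded via the binomial theorem
    h = [Fraction(0)] * max(len(coeffs) - 1, 1)
    for i, fi in enumerate(coeffs):
        for j in range(i):
            h[j] += fi * comb(i, j)
    return h

f = [Fraction(c) for c in (1, 0, -4, 2, 5)]  # f(x) = 5x^4 + 2x^3 - 4x^2 + 1
for x in (Fraction(0), Fraction(1, 2), Fraction(-2)):
    delta_L = L(f, x + 1) - L(f, x)  # (Delta L f)(x)
    L_delta = L(delta_coeffs(f), x)  # (L Delta f)(x)
    assert delta_L == L_delta
```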

By the way, the operators $\frac{d}{dx}$ and $\Delta$ are not invertible, a fact that amounts to the observation that the antiderivative and the anti-forward-difference are not uniquely defined.

Added: This expands on the parenthetical remark in the first paragraph. Let $f(x)=\sum_{i=0}^n f_ix^i$ be a polynomial of degree $n$ and let $f'(x)=\sum_{i=0}^{n-1}g_ix^i$, $\Delta f(x)=\sum_{i=0}^{n-1}h_ix^i$. Comparing the first and last lines of \begin{align} \sum_{i=0}^{n-1}h_i x^i &= \Delta f(x)\\ &=f(x+1)-f(x)\\ &=\sum_{i=0}^nf_i((x+1)^i-x^i)\\ &=\sum_{i=1}^nf_i\sum_{j=0}^{i-1}\binom{i}{j}x^j\\ &=\sum_{i=1}^n if_i\sum_{j=0}^{i-1}\frac{1}{i}\binom{i}{j}x^j\\ &=\sum_{i=0}^{n-1} g_i\sum_{j=0}^{i}\frac{1}{i+1}\binom{i+1}{j}x^j \end{align} gives a system of linear equations relating the $h_i$ to the $g_i$. The coefficient matrix is upper triangular, so its inverse is found by straightforward iteration. Carrying out the inversion gives $$ \sum_{i=0}^{n-1}g_ix^i=\sum_{i=0}^{n-1}h_iB_i(x). $$ This is where the prescription "replace every power $x^i$ in $\Delta f(x)$ with $B_i(x)$ to obtain $f'(x)$" comes from. Taking this linear algebra problem as the definition of the Bernoulli polynomials allows a great many things to be easily explained, in my view.
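The prescription can also be verified directly: compute the coefficients $h_i$ of $\Delta f$ for a concrete polynomial and check that $\sum_i h_i B_i(x)$ reproduces $f'(x)$. A small sketch of mine (the example polynomial is arbitrary):

```python
from fractions import Fraction
from math import comb

# Bernoulli numbers and polynomials, as in the summation-formula definition
Bnum = [Fraction(1)]
for n in range(1, 8):
    Bnum.append(-sum(comb(n + 1, k) * Bnum[k] for k in range(n)) / (n + 1))

def Bpoly(n, x):
    return sum(comb(n, j) * Bnum[j] * x ** (n - j) for j in range(n + 1))

# f(x) = 3x^4 - 2x^2 + 7x + 5, coefficients stored low-degree first
f = [Fraction(c) for c in (5, 7, -2, 0, 3)]

# h_i: coefficients of Delta f(x) = sum_i f_i ((x+1)^i - x^i)
h = [Fraction(0)] * (len(f) - 1)
for i, fi in enumerate(f):
    for j in range(i):
        h[j] += fi * comb(i, j)

# "replace every power x^i in Delta f with B_i(x)" should give f'(x)
for x in (Fraction(0), Fraction(1, 2), Fraction(-3)):
    lhs = sum(h[i] * Bpoly(i, x) for i in range(len(h)))
    fprime = sum(i * f[i] * x ** (i - 1) for i in range(1, len(f)))
    assert lhs == fprime
```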

Answer 3

To calculate

$$ \sum_{k=0}^{m-1} f(x+k) $$

one finds a function $g(x)$ such that

$$\Delta g(x) := g(x+1) - g(x) = f(x)$$

and then

$$g(x+m) - g(x) = \sum_{k=0}^{m-1} f(x+k)$$
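As a concrete illustration (my own sketch, not part of the answer): for $f(x) = x^p$ one can take $g(x) = B_{p+1}(x)/(p+1)$, since $\Delta B_{p+1}(x) = (p+1)x^p$, and the telescoping identity then evaluates power sums exactly:

```python
from fractions import Fraction
from math import comb

# Bernoulli numbers and polynomials from the recurrence definition
Bnum = [Fraction(1)]
for n in range(1, 8):
    Bnum.append(-sum(comb(n + 1, k) * Bnum[k] for k in range(n)) / (n + 1))

def Bpoly(n, x):
    return sum(comb(n, j) * Bnum[j] * x ** (n - j) for j in range(n + 1))

# For f(x) = x^p, g(x) = B_{p+1}(x)/(p+1) satisfies Delta g = f, so
# sum_{k=0}^{m-1} (x+k)^p = g(x+m) - g(x).
x = Fraction(2, 5)  # arbitrary rational starting point
for p in range(5):
    for m in (1, 4, 10):
        telescoped = (Bpoly(p + 1, x + m) - Bpoly(p + 1, x)) / (p + 1)
        direct = sum((x + k) ** p for k in range(m))
        assert telescoped == direct
```

With $x = 0$ this is exactly Faulhaber's formula for $\sum_{k=0}^{m-1} k^p$.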

There is a formal procedure to find $g$ that provides a solution when $f$ is a polynomial. Say we want to solve

$$\Delta g(x) = f(x) = D h(x)$$

where $D$ is the usual derivative operator. For a polynomial $g$, Taylor's theorem gives $g(x+1) = e^{D}g(x)$ (the exponential series terminates on polynomials), so

$$\Delta g = (e^D-1) g$$

and so we have to solve

$$(e^D-1) g = D h$$ with a solution

$$g = \frac{D}{e^D-1} h$$

The Bernoulli polynomials are solutions of

$$B_n(x+1) - B_n(x) = n x^{n-1} = (x^n)'$$

so we take

$$B_n(x) = \frac{D}{e^D-1} x^n$$

Define the Bernoulli numbers to be the coefficients in the expansion

$$\frac{D}{e^D-1} = \sum_{k=0}^{\infty} \frac{B_k}{k!} D^k$$

and we get

$$B_n(x) = \sum_{k\ge 0} \frac{B_k}{k!} D^k(x^n)= \sum_{k=0 }^n\binom{n}{k} B_k x^{n-k}$$

From the above we conclude that

$$\frac{t}{e^t-1} \cdot e^{x t} = \sum_{n \ge 0} \frac{B_n(x)}{n!} t^n$$

$\textbf{Added:}$ From $B_n(x) = \frac{D}{e^D-1} x^n$ we conclude that

$$B_n'(x) = n B_{n-1}(x)$$ for $n \ge 1$, since $D$ commutes with $\frac{D}{e^D-1}$: $$B_n'(x) = \frac{D}{e^D-1}\, D(x^n) = \frac{D}{e^D-1}\, n x^{n-1} = n B_{n-1}(x).$$
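This derivative rule can be confirmed coefficient-by-coefficient from the summation formula. A sketch of mine in plain Python (names are my own):

```python
from fractions import Fraction
from math import comb

# Bernoulli numbers from the recurrence
Bnum = [Fraction(1)]
for n in range(1, 9):
    Bnum.append(-sum(comb(n + 1, k) * Bnum[k] for k in range(n)) / (n + 1))

def Bcoeffs(n):
    # coefficient list of B_n(x), lowest degree first
    c = [Fraction(0)] * (n + 1)
    for j in range(n + 1):
        c[n - j] = comb(n, j) * Bnum[j]
    return c

# check B_n'(x) = n B_{n-1}(x) as an identity of coefficient lists
for n in range(1, 9):
    c = Bcoeffs(n)
    deriv = [i * c[i] for i in range(1, n + 1)]
    assert deriv == [n * ci for ci in Bcoeffs(n - 1)]
```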