Explicitly Showing Linear Transformation of space of polynomials


Let $E$ be the space of all polynomials $f(\epsilon) = c_0 + c_1\epsilon + \dots + c_{n-1} \epsilon^{n-1}$ of degree $\leq n-1$ (for some fixed $n$; the coefficients $c_j$ can be taken either real or complex). The derivative $Df(\epsilon) = f'(\epsilon)$ defines a linear transformation $D$ of $E$. I want to show: $$\exp(\tau D)f(\epsilon) = f(\epsilon + \tau),$$ where $\exp X$ is defined as the exponential of a matrix $X$: $$\exp X = \sum_{k=0}^\infty\frac{1}{k!}X^k.$$

I began by trying to write down a basis of $E$, but had trouble conceptualizing the space $E$ itself. Is it supposed to consist of all possible combinations of the constants and powers of $\epsilon$? Why couldn't I just represent an element as the product of a row of coefficients $c_0, \dots, c_{n-1}$ with a column of monomials $\epsilon^0, \dots, \epsilon^{n-1}$? Of course, the derivative of $f(\epsilon)$ is $f'(\epsilon) = c_1 + 2c_2\epsilon + \dots + (n-1)c_{n-1}\epsilon^{n-2}$, but I am also having trouble establishing the relationship between the two functions, especially when I try to work with their respective bases.

On BEST ANSWER

The space $E$ consists of all polynomial functions of degree $\leq n - 1$. This is an $n$-dimensional space for which one choice of basis is given by the polynomial functions $(1,\varepsilon, \dots, \varepsilon^{n - 1})$. It is possible to prove your exercise by representing $D$ as a matrix, calculating $\exp(\tau D)$, and verifying the equality, but in this case it is actually easier to proceed directly. Note that $D$ is a nilpotent operator because $D^n = 0$ (the $n$-th derivative of a polynomial of degree $\leq n - 1$ is zero). Let us take the polynomial $f = \varepsilon^i$ and calculate $(\exp(\tau D)f)(\varepsilon)$:
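As a quick numerical sanity check of the nilpotency claim, here is a sketch that builds the matrix of $D$ in the monomial basis $(1, \varepsilon, \dots, \varepsilon^{n-1})$ and confirms $D^n = 0$ (the choice $n = 5$ is an arbitrary illustration):

```python
import numpy as np

n = 5  # polynomials of degree <= n-1; illustrative choice

# Matrix of D in the basis (1, eps, ..., eps^{n-1}):
# D maps eps^j to j * eps^{j-1}, so column j has the entry j in row j-1.
D = np.zeros((n, n))
for j in range(1, n):
    D[j - 1, j] = j

print(np.linalg.matrix_power(D, n))      # the zero matrix: D is nilpotent
print(np.linalg.matrix_power(D, n - 1))  # nonzero: D^{n-1} sends eps^{n-1} to (n-1)!
```

A strictly upper-triangular matrix like this one is always nilpotent, which is another way to see that the exponential series below has only finitely many nonzero terms.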

$$ (\exp(\tau D)f)(\varepsilon) = \left( \left( \sum_{k=0}^n \frac{(\tau D)^k}{k!} \right) f \right)(\varepsilon) = \sum_{k = 0}^n \frac{\tau^k}{k!} f^{(k)}(\varepsilon) = \sum_{k=0}^i { i \choose k} \tau^k \varepsilon^{i - k} = (\varepsilon + \tau)^i = f(\varepsilon + \tau). $$

Hence, the result is true for $f = \varepsilon^i$. Since both sides of the equality are linear in $f$, the result extends to all polynomials of degree $\leq n - 1$.
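The matrix route mentioned above can also be checked numerically. The sketch below (with an arbitrary $n$, shift $\tau$, and coefficient vector, all illustrative choices) computes $\exp(\tau D)$ as a finite sum, since the series truncates at $k = n - 1$, applies it to the coefficients of $f$, and compares with $f(\varepsilon + \tau)$ at sample points:

```python
import numpy as np
from math import factorial

n, tau = 5, 0.7  # illustrative dimension and shift

# Matrix of D in the monomial basis, acting on coefficient vectors (c_0, ..., c_{n-1}).
D = np.zeros((n, n))
for j in range(1, n):
    D[j - 1, j] = j

# exp(tau * D): the exponential series truncates because D^n = 0.
expD = sum(np.linalg.matrix_power(tau * D, k) / factorial(k) for k in range(n))

c = np.array([2.0, -1.0, 0.5, 3.0, 1.0])  # coefficients of an arbitrary f
shifted = expD @ c                         # coefficients of exp(tau D) f

# Evaluate both sides at sample points (coefficients in increasing-degree order).
eps = np.linspace(-2.0, 2.0, 7)
lhs = np.polynomial.polynomial.polyval(eps, shifted)
rhs = np.polynomial.polynomial.polyval(eps + tau, c)
print(np.allclose(lhs, rhs))  # True
```

So in the monomial basis, $\exp(\tau D)$ is exactly the matrix that sends the coefficients of $f(\varepsilon)$ to those of $f(\varepsilon + \tau)$.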

Alternatively, note that $\sum_{k = 0}^n \frac{\tau^k}{k!} f^{(k)}(\varepsilon)$ is just the Taylor expansion of $f(\varepsilon + \tau)$ (treating $\tau$ as the variable and $\varepsilon$ as a constant) around $\tau = 0$; since $f$ is a polynomial, the expansion terminates and equals $f(\varepsilon + \tau)$ exactly.
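This Taylor-sum viewpoint can be checked directly, without any matrices, by summing $\frac{\tau^k}{k!} f^{(k)}(\varepsilon)$ term by term (again with an arbitrary illustrative polynomial and shift):

```python
import numpy as np
from math import factorial

n, tau = 5, 0.7                            # illustrative dimension and shift
c = np.array([2.0, -1.0, 0.5, 3.0, 1.0])   # coefficients of an arbitrary f
P = np.polynomial.Polynomial(c)

eps = np.linspace(-2.0, 2.0, 7)

# Finite Taylor sum: sum_{k=0}^{n-1} tau^k / k! * f^{(k)}(eps).
taylor = P(eps) + sum(tau**k / factorial(k) * P.deriv(k)(eps)
                      for k in range(1, n))

print(np.allclose(taylor, P(eps + tau)))  # True
```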