Proving that a linear transformation is diagonalisable


Given that $V = \mathbb{R}[X]_{\leq 2}$ and $\alpha \in \mathbb{R}$ prove that the linear transformation $L: V \to V$ given by $L(P(X)) = \alpha P(X) + (X+1)P'(X)$ is diagonalisable and determine the matrix representation with respect to a basis of eigenvectors.

I know that $L$ is diagonalisable if $\#\operatorname{Spec}(L) = \dim(V) = 3$, meaning that $L$ must have $3$ distinct eigenvalues. I imagine I have to find the eigenvalues of $L$, but I'm not really sure where to go from here. Any help is appreciated.


Accepted answer:

Note that $\operatorname{Id}_n\colon\mathbb R^n\longrightarrow\mathbb R^n$ has a single eigenvalue, but it is diagonalizable nevertheless. So, no, you don't need to have $3$ distinct eigenvalues.

On the other hand:

  • $L(1)=\alpha\times1$;
  • $L(X)=(\alpha+1)X+1$;
  • $L(X^2)=2X+(\alpha+2)X^2$.

Therefore, the matrix of $L$ with respect to the basis $\{1,X,X^2\}$ is $$\begin{bmatrix}\alpha&1&0\\0&\alpha+1&2\\0&0&\alpha+2\end{bmatrix}.$$ Can you take it from here?
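As a quick sanity check (my addition, not part of the answer), this matrix can be rebuilt symbolically with sympy: column $j$ holds the coordinates of $L$ applied to the $j$-th basis polynomial.

```python
import sympy as sp

X, a = sp.symbols('X alpha')
basis = [sp.Integer(1), X, X**2]

def L(P):
    """The operator L(P) = alpha*P + (X + 1)*P' from the question."""
    return sp.expand(a * P + (X + 1) * sp.diff(P, X))

# Column j holds the coordinates of L(basis[j]) in the basis {1, X, X^2}.
M = sp.Matrix(3, 3, lambda i, j: L(basis[j]).coeff(X, i))
print(M)  # Matrix([[alpha, 1, 0], [0, alpha + 1, 2], [0, 0, alpha + 2]])
```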

Answer:

The images of the basis $(1,X,X^2)$ are $(\alpha,\;(\alpha+1)X+1,\;(\alpha+2)X^2+2X)$, so the matrix representation is:

$$M=\begin{pmatrix}\alpha&1&0\\0&\alpha+1&2\\0&0&\alpha+2\end{pmatrix}$$

Since $M$ is triangular, its eigenvalues are its diagonal entries:

$$\alpha<\alpha+1<\alpha+2$$

Since these are all distinct, whatever the value of $\alpha$, the matrix is always diagonalizable.
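To illustrate (this check is mine, not the answerer's), sympy will diagonalize $M$ for any concrete value of $\alpha$, e.g. $\alpha = 0$:

```python
import sympy as sp

a = 0  # any real alpha works; the eigenvalues a, a+1, a+2 stay distinct
M = sp.Matrix([[a, 1, 0], [0, a + 1, 2], [0, 0, a + 2]])

# diagonalize() returns P, D with M = P * D * P**-1; it raises an
# exception if M is not diagonalizable.
P, D = M.diagonalize()
print(sorted(D[i, i] for i in range(3)))  # [0, 1, 2]
print(P.inv() * M * P == D)               # True
```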

Answer:

One can also approach this directly using the definition of an eigenvector and basic facts about polynomials. If $Q$ is an eigenvector of $L$, by definition we have $$L(Q(X)) = \alpha Q(X) + (X + 1) Q'(X) = \lambda Q(X)$$ for some constant $\lambda$, and rearranging gives $$(X + 1) Q'(X) = (\lambda - \alpha) Q(X) .$$ If $Q'(X) = 0$, then $Q$ is constant and $\lambda = \alpha$, giving one eigenvalue. If $Q'(X) \neq 0$, then the left-hand side is nonzero, so $\lambda \neq \alpha$, and hence $(X + 1)$ divides $Q(X)$; thus $Q(X) = R(X) (X + 1)$ for some $R$ with $\deg R \leq 1$. Substituting into the previous display equation and cancelling $(X + 1)$ gives $$R(X) + R'(X) (X + 1) = (\lambda - \alpha) R(X) ,$$ and rearranging gives $$(X + 1) R'(X) = (\lambda - (\alpha + 1)) R(X) .$$

Now, either $R'(X) = 0$, in which case $R$ is constant and $\lambda = \alpha + 1$, or $R'(X) \neq 0$, in which case, by the same argument as before, $R$ is a (now constant) multiple of $X + 1$; then $Q(X) = R(X) (X + 1)$ is, up to a nonzero multiple, $(X + 1)^2$, and substituting again gives $\lambda = \alpha + 2$. We've now found that $L$ has three distinct eigenvalues, namely $\alpha, \alpha + 1, \alpha + 2$, and that with respect to the basis $(1, X + 1, (X + 1)^2)$, $L$ has the matrix representation $$[L] = \pmatrix{\alpha\\&\alpha + 1\\&&\alpha + 2} .$$
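The eigenbasis found above is easy to verify mechanically; this sympy check (my addition, not part of the answer) confirms that each $(X+1)^k$ is an eigenvector with eigenvalue $\alpha + k$:

```python
import sympy as sp

X, a = sp.symbols('X alpha')

def L(P):
    """The operator L(P) = alpha*P + (X + 1)*P'."""
    return sp.expand(a * P + (X + 1) * sp.diff(P, X))

# (X + 1)^k is an eigenvector of L with eigenvalue alpha + k, for k = 0, 1, 2.
for k in range(3):
    Q = (X + 1)**k
    assert sp.expand(L(Q) - (a + k) * Q) == 0
print("eigenbasis (1, X+1, (X+1)^2) verified")
```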

Answer:

We may as well replace $1$ with an arbitrary constant $-r$, so that $L(P(X)) = \alpha P(X) + (X - r) P'(X)$. Suppose $Q$ is an eigenvector of $L$, say with eigenvalue $\lambda$. Rearranging gives $$(X - r) Q'(X) = (\lambda - \alpha) Q(X) .$$ But this is a separable ordinary differential equation. Since polynomials are analytic, it suffices to work on some nonempty open interval not containing $r$ (or more to the point, on which $X - r$ is nonvanishing). Then, dividing gives $$\frac{\lambda - \alpha}{X - r} = \frac{Q'(X)}{Q(X)} = (\log Q)'(X),$$ and integrating gives that, up to a constant multiple, $$Q(X) = (X - r)^{\lambda - \alpha}.$$ This function is an element of $\mathbb{R}[X]_{\leq 2}$ precisely when $\lambda - \alpha \in \{0, 1, 2\}$. (Note that this efficiently solves the corresponding problem for the operator $L$ defined on the polynomial space $\mathbb{R}[X]_{\leq d}$ for large $d$.)
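The closing remark generalises cleanly. Assuming the obvious extension of $L$ to $\mathbb{R}[X]_{\leq d}$ (my reading of the parenthetical, not stated explicitly in the answer), a short sympy check confirms that $(X - r)^k$ is an eigenvector with eigenvalue $\alpha + k$ for each $k$:

```python
import sympy as sp

X, a, r = sp.symbols('X alpha r')

def L(P):
    """Generalised operator with the constant 1 replaced by -r."""
    return sp.expand(a * P + (X - r) * sp.diff(P, X))

# On R[X]_{<= d}, the eigenvectors are (X - r)^k with eigenvalue alpha + k.
d = 5
for k in range(d + 1):
    Q = sp.expand((X - r)**k)
    assert sp.expand(L(Q) - (a + k) * Q) == 0
print(f"eigenvalues alpha + k verified for k = 0..{d}")
```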