Eigenvalues of the linear map given by the second derivative


Let $V$ be the space of all $\textbf{bounded}$ functions $\mu: \mathbb{R} \rightarrow \mathbb{R}$ having derivatives of all orders. Define an operator $T: V \rightarrow V$ by $T\mu = -\frac{\partial^{2}\mu}{\partial t^{2}}$.

Find all the eigenvalues of $T$.


Suppose you have some polynomial $p(x) = \alpha_{0} + \alpha_{1}x + \alpha_{2}x^{2} + \dots + \alpha_{n}x^{n}$.

Okay, so to get the linear operator for the second derivative, I am thinking it looks like this:

$$\begin{pmatrix}
\frac{2!}{0!}\cdot\frac{\alpha_{2}}{\alpha_{0}} & 0 & \cdots & \cdots & \cdots & 0 \\
0 & \frac{3!}{1!}\cdot\frac{\alpha_{3}}{\alpha_{1}} & \ddots & & & \vdots \\
\vdots & \ddots & \ddots & \ddots & & \vdots \\
\vdots & & \ddots & \frac{n!}{(n-2)!}\cdot\frac{\alpha_{n}}{\alpha_{n-2}} & \ddots & \vdots \\
\vdots & & & \ddots & 0 & 0 \\
0 & \cdots & \cdots & \cdots & 0 & 0
\end{pmatrix}$$

So it gives $\lambda_{1} = 2\cdot\frac{\alpha_{2}}{\alpha_{0}},\ \lambda_{2} = 6\cdot\frac{\alpha_{3}}{\alpha_{1}},\ \dots,\ \lambda_{n-2} = n(n-1)\cdot\frac{\alpha_{n}}{\alpha_{n-2}},\ \lambda_{n-1} = \lambda_{n} = 0$.

Is this correct?

The question did not specify a polynomial function, I simply assumed it. Is this assumption valid?
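As a sanity check on this coefficient bookkeeping, here is a small SymPy computation (the quartic and the names `a0`…`a4` are purely illustrative) showing what $T p = -p''$ does to the coefficients of a polynomial:

```python
import sympy as sp

x = sp.symbols('x')
a = sp.symbols('a0:5')  # a0..a4: coefficients of an illustrative quartic
p = sum(a[k] * x**k for k in range(5))

# T p = -p''
Tp = sp.expand(-sp.diff(p, x, 2))

# the coefficient of x^k in T p is -(k+2)(k+1) * a_{k+2}
coeffs = [Tp.coeff(x, k) for k in range(5)]
print(coeffs)  # [-2*a2, -6*a3, -12*a4, 0, 0]
```

Note that each output coefficient depends linearly on $\alpha_{k+2}$, with the fixed factor $-(k+2)(k+1)$ in front.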


There are 2 best solutions below


Hint: If $\mu$ is an eigenvector, then

$$T\mu=\lambda\mu\Longrightarrow-\dfrac{\partial^{2}\mu}{\partial t^{2}}=\lambda\mu$$

That's an ODE, can you solve it?
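If you want to check your hand solution, the hint's ODE can also be handed to a CAS. A minimal SymPy sketch (the positivity assumption on `lam` just picks out one of the possible sign cases):

```python
import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lam', positive=True)  # illustrative: the case lambda > 0
y = sp.Function('y')

# the eigenvalue equation  -y'' = lam * y
ode = sp.Eq(-y(t).diff(t, 2), lam * y(t))
sol = sp.dsolve(ode, y(t))
print(sol)
```

Rerunning with `lam` negative or zero produces the other solution families.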


No, assuming the eigenvectors are polynomials is not a valid assumption: you'll miss every eigenvector that doesn't correspond to $\lambda = 0$. For example, $\sin$ is an eigenvector corresponding to $\lambda = 1$, since $-\sin'' = \sin$.

Ultimately, to find the eigenvectors corresponding to eigenvalue $\lambda$, you need to find a general solution to the differential equation:

$$Ty = \lambda y \iff y'' = -\lambda y$$

I'd recommend splitting the equation into three cases: $\lambda > 0$, $\lambda < 0$, and $\lambda = 0$. The simplest is the last. If $\lambda = 0$, then $$y'' = 0 \iff y' = C \iff y = Cx + D$$ for some constants $C, D$. The solution space of the ODE is the two-dimensional space $P_1(\Bbb{R})$; note, though, that $Cx + D$ is bounded only when $C = 0$, so the eigenspace inside $V$ consists of just the constant functions.

If $\lambda > 0$, then the solutions will take the form $$y = A\sin\left(\sqrt{\lambda}x\right) + B\cos\left(\sqrt{\lambda}x\right),$$ which is again a two-dimensional subspace: $$\operatorname{span}\left\{\sin\left(\sqrt{\lambda}x\right), \cos\left(\sqrt{\lambda}x\right)\right\}.$$ These functions are bounded, so they genuinely lie in $V$.

If $\lambda < 0$, then the solutions will take the form $$y = Ae^{\sqrt{-\lambda} x} + B e^{-\sqrt{-\lambda}x},$$ which again gives a two-dimensional solution space $$\operatorname{span}\left\{e^{\sqrt{-\lambda} x}, e^{-\sqrt{-\lambda}x}\right\} = \operatorname{span}\left\{\sinh\left(\sqrt{-\lambda} x\right), \cosh\left(\sqrt{-\lambda}x\right)\right\}.$$ Unlike the previous case, however, every nonzero solution here is unbounded on $\mathbb{R}$, so none of them lie in $V$: $T$ has no negative eigenvalues on the space of bounded functions.
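To tie the three cases together, here is a short SymPy verification (a sketch; `lam` stands for $|\lambda|$, so the $\lambda < 0$ case uses eigenvalue `-lam`) that one representative from each family satisfies $-y'' = \lambda y$:

```python
import sympy as sp

x = sp.symbols('x')
lam = sp.symbols('lam', positive=True)  # lam = |lambda|

# lambda > 0: bounded sin/cos solutions
y_pos = sp.sin(sp.sqrt(lam) * x)
print(sp.simplify(-y_pos.diff(x, 2) - lam * y_pos))    # 0

# lambda = 0: linear solutions y = C*x + D
C, D = sp.symbols('C D')
y_zero = C * x + D
print(-y_zero.diff(x, 2))                              # 0

# lambda < 0 (eigenvalue -lam): exponential solutions, unbounded on R
y_neg = sp.exp(sp.sqrt(lam) * x)
print(sp.simplify(-y_neg.diff(x, 2) - (-lam) * y_neg)) # 0
```

Each printed residual being zero confirms the corresponding family solves the eigenvalue ODE; boundedness then decides which families survive in $V$.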