Show that there are no negative eigenvalues.


I've been trying to solve this Sturm-Liouville theory problem.

Show that the problem:

$$\begin{cases} y''+(x+\lambda)y = 0,\\ y(0)=0,\\ y(1)=0 \end{cases}$$

doesn't have nontrivial solutions if $\lambda<0$.

Equivalently, the exercise asks to show that the Sturm–Liouville operator $L[y] := -(y''+xy)$ has no negative eigenvalues, since $L[y]=\lambda y$ is exactly the equation above. I've tried the standard trick of multiplying the equation by $y$ and integrating over $[0,1]$, but it doesn't seem to lead anywhere.
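For reference, here is the identity that attempt produces (a standard integration by parts, using $y(0)=y(1)=0$ to kill the boundary term $[y'y]_0^1$):

$$\int_0^1 y''y\,dx + \int_0^1 (x+\lambda)\,y^2\,dx = 0
\quad\Longrightarrow\quad
\int_0^1 (y')^2\,dx = \int_0^1 (x+\lambda)\,y^2\,dx.$$

The difficulty is that for $-1<\lambda<0$ the weight $x+\lambda$ changes sign on $[0,1]$, so a sign argument on the right-hand side alone gives no contradiction.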

Any alternative approach or hint would be appreciated.
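As a quick numerical sanity check (not a proof, and assuming SciPy is available), one can shoot from $x=0$: integrate the IVP $y''+(x+\lambda)y=0$, $y(0)=0$, $y'(0)=1$, and observe that $y(1)$ stays away from $0$ for negative $\lambda$, so the second boundary condition is never met:

```python
import numpy as np
from scipy.integrate import solve_ivp

def shoot(lam):
    """Return y(1) for the solution of y'' + (x+lam) y = 0
    with y(0) = 0, y'(0) = 1 (any nontrivial solution vanishing
    at 0 is a scalar multiple of this one)."""
    def rhs(x, u):
        y, yp = u            # u = (y, y')
        return [yp, -(x + lam) * y]
    sol = solve_ivp(rhs, (0.0, 1.0), [0.0, 1.0], rtol=1e-10, atol=1e-12)
    return sol.y[0, -1]

for lam in (-0.5, -1.0, -5.0):
    print(lam, shoot(lam))   # y(1) > 0 in each case: no eigenfunction
```

For $\lambda \le -1$ the coefficient $x+\lambda$ is nonpositive on $[0,1]$, so $y$ is convex and cannot return to zero; the interesting regime is $-1<\lambda<0$, where the check still shows $y(1)>0$.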