

(a) Consider the operator $\mathcal{L}=-d^2 / d x^2$ on $x \in[a, b]$. Prove that if we choose boundary conditions (BCs) such that $$ \left.f(x) f^{\prime}(x)\right|_{x=a} ^{x=b} \leq 0 $$ for all (real-valued) $f(x) \in \mathcal{D}(\mathcal{L})$ that satisfy the BCs, then $\mathcal{L}$ has no negative eigenvalues.

(b) Suppose that $\mathcal{L}=p(x) d^2 / d x^2+q(x) d / d x+r(x)$ for continuous functions $p, q, r$ on $[a, b]$. Provide conditions on $p, q, r$ such that the above result generalizes to this $\mathcal{L}$.

I am reading Teschl and trying to solve some problems posted online, and I am getting stuck almost everywhere; Teschl is quite tough for me. I am hoping to gain some understanding with your help, so if you have a hint or a way forward from here, please let me know. Also, if you know of a more accessible book for learning this subject and its problems, I would appreciate a recommendation.



Accepted answer:

Answer for (b): Suppose $\mathcal Lf=\lambda f$. Then, integrating by parts, \begin{align} \lambda\int^b_af^2dx&=\int^b_af\mathcal Lf\,dx=\int^b_a(pff''+qff'+rf^2)dx\\ &=[pff']^b_a+\int^b_a(-(pf)'f'+qff'+rf^2)dx\\ &=[pff']^b_a+\int^b_a(-p(f')^2+(q-p')ff'+rf^2)dx. \end{align} Now impose $p'=q$ (which kills the middle term), $p\leq 0$, $r\geq 0$, and $p(a)=p(b)$, and keep the boundary conditions from part (a). Writing $c=p(a)=p(b)\leq 0$, the boundary term becomes $[pff']^b_a=c\,[ff']^b_a\geq 0$, since $[ff']^b_a\leq 0$ by the BCs. The remaining integrand $-p(f')^2+rf^2$ is also nonnegative, so $\lambda\int^b_af^2dx\geq 0$, thus $\lambda\geq 0$. Hence $\mathcal L$ cannot have negative eigenvalues. Part (a) is just the special case $p=-1$, $q=r=0$.
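As a numerical sanity check on these conditions, here is a short sketch (not from the original post) that discretizes $\mathcal L f = (pf')' + rf$ with Dirichlet BCs $f(a)=f(b)=0$ for an illustrative choice of $p\le 0$ and $r\ge 0$, and checks that the smallest eigenvalue is nonnegative:

```python
import numpy as np

# Illustrative check for (b): p <= 0, q = p', r >= 0 on [0, 1] with
# Dirichlet BCs. With q = p' the operator is L f = (p f')' + r f; we
# discretize it conservatively on interior nodes x_1, ..., x_n:
#   ((p f')')_i ~ (p_{i+1/2}(f_{i+1}-f_i) - p_{i-1/2}(f_i-f_{i-1})) / h^2
n = 200
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)                # interior grid points
xh = np.linspace(h / 2, 1 - h / 2, n + 1)   # midpoints x_{i +/- 1/2}

p = -(1.0 + xh**2)   # p(x) = -(1 + x^2) <= 0 (illustrative choice)
r = np.exp(x)        # r(x) = e^x >= 0      (illustrative choice)

main = -(p[:-1] + p[1:]) / h**2 + r         # diagonal entries
off = p[1:-1] / h**2                        # off-diagonal entries
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

eigs = np.linalg.eigvalsh(A)                # symmetric -> real spectrum
print(eigs[0])                              # smallest eigenvalue, positive
```

Dirichlet BCs are just one choice satisfying $ff'|^b_a\le 0$; the argument in the answer covers any BCs with that sign property.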

Note that the condition $p'=q$ allows the operator to be written in divergence form $$\mathcal Lg={d\over dx}\left(p(x){dg\over dx}\right)+r(x)g,$$ which is the form Sturm–Liouville operators are usually assumed to take. This condition ensures the operator is symmetric (assuming suitable boundary conditions hold), in the sense that $\int g_1\mathcal Lg_2\,dx=\int g_2\mathcal Lg_1\,dx$.
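The symmetry claim can also be seen discretely: the conservative finite-difference matrix for the divergence-form term is symmetric, so $\langle g_1, \mathcal L g_2\rangle = \langle g_2, \mathcal L g_1\rangle$ for grid vectors satisfying the BCs. A minimal sketch (with an illustrative $p$, not from the post):

```python
import numpy as np

# With q = p', the divergence-form discretization of (p g')' on a
# Dirichlet grid yields a symmetric matrix, mirroring the symmetry
# of the continuous Sturm-Liouville operator.
n = 50
h = 1.0 / (n + 1)
xh = np.linspace(h / 2, 1 - h / 2, n + 1)   # midpoints x_{i +/- 1/2}
p = -np.cos(xh)                             # illustrative smooth p(x)

main = -(p[:-1] + p[1:]) / h**2             # diagonal entries
off = p[1:-1] / h**2                        # off-diagonal entries
L = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

print(np.allclose(L, L.T))                  # True: L is symmetric
```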

Another answer:

Part (a): \begin{align} \int_a^b(\mathcal{L}f)f\,dx & =\int_a^b(-f'')f\,dx \\ &=\int_a^b\left[(-f'f)'+(f')^2\right]dx \\ &=-f'f\big|_a^b+\int_a^b (f')^2\,dx \\ &\ge \int_a^b (f')^2\,dx, \end{align} where the last step uses the assumed BCs, which give $ff'|_a^b\le 0$ and hence $-f'f|_a^b\ge 0$. If $\mathcal{L}f=\lambda f$, the above gives $\lambda\int_a^b f^2\, dx \ge \int_a^b (f')^2\,dx \ge 0$, which forces $\lambda \ge 0$.
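This can also be checked numerically. The sketch below (my own, not part of the answer) discretizes $\mathcal L=-d^2/dx^2$ on $[0,1]$ with Dirichlet BCs $f(0)=f(1)=0$, which trivially satisfy $ff'|_a^b = 0 \le 0$; all eigenvalues should be positive, with the smallest near $\pi^2$:

```python
import numpy as np

# Discretize L = -d^2/dx^2 on [0, 1] with Dirichlet BCs using the
# standard second-order finite-difference stencil on interior nodes.
n = 200                             # number of interior grid points
h = 1.0 / (n + 1)                   # grid spacing
main = 2.0 * np.ones(n) / h**2      # diagonal of the stencil
off = -np.ones(n - 1) / h**2        # off-diagonal of the stencil
L = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

eigs = np.linalg.eigvalsh(L)        # symmetric -> real eigenvalues
print(eigs[0])                      # smallest eigenvalue ~ pi^2 = 9.8696...
```

The exact smallest eigenvalue of the continuous problem is $\pi^2$ (eigenfunction $\sin(\pi x)$), and the discrete spectrum converges to it as $n$ grows.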