Solve eigenvalue problem $(\frac{u'}{x})'+\frac{\lambda}{x}u=0$


Consider the eigenvalue problem $(\frac{u'}{x})'+\frac{\lambda}{x}u=0$, $x\in (1,2)$,

with boundary conditions $u(1)=u(2)=0$.

I want to determine the sign of the eigenvalues first. But since this is not a standard eigenvalue problem of the form $u''+\lambda u=0$, I cannot simply check whether it satisfies symmetric boundary conditions. I tried some transformations of $u$ to make the equation more regular, but failed.
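As a quick sanity check on the sign question, one can discretize the problem numerically. The equation is in Sturm-Liouville form $-(p u')' = \lambda w u$ with $p(x)=w(x)=1/x>0$, so a standard finite-difference discretization gives a symmetric generalized eigenvalue problem. The sketch below is my own check, not part of the original question; the grid size `N` is an arbitrary choice.

```python
import numpy as np
from scipy.linalg import eigh

# Sketch: discretize -(u'/x)' = lambda * (1/x) * u on (1,2) with u(1)=u(2)=0
# using a standard 3-point conservative scheme, then inspect eigenvalue signs.
N = 200                                    # number of interior grid points (my choice)
h = 1.0 / (N + 1)                          # interval (1,2) has length 1
x = 1.0 + h * np.arange(1, N + 1)          # interior grid points

p = lambda s: 1.0 / s                      # coefficient p(x) = 1/x
pm = p(x - h / 2)                          # p at left midpoints
pp = p(x + h / 2)                          # p at right midpoints

# A approximates -(p u')'; it is symmetric and positive definite since p > 0.
A = (np.diag(pm + pp) - np.diag(pp[:-1], 1) - np.diag(pm[1:], -1)) / h**2
W = np.diag(1.0 / x)                       # weight w(x) = 1/x, positive definite

# Generalized symmetric eigenproblem A v = lambda W v; eigenvalues ascending.
evals = eigh(A, W, eigvals_only=True)
print(evals[:3])
```

All computed eigenvalues come out positive, which is what the Sturm-Liouville structure ($p>0$, $w>0$, Dirichlet boundary conditions) predicts.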

Any hints on this would be appreciated.


There are 2 best solutions below

BEST ANSWER

It might be useful. We have $$(\frac{u'}{x})'+\frac{\lambda}{x}u=0.$$ Let $u = xz$; then $u'/x = z' + z/x$, and the equation becomes $$z'' + \frac{z'}{x} - \frac{z}{x^2} + \lambda z = 0.$$ Multiplying by $x^2$ we obtain $$x^2z''+xz'+(\lambda x^2 - 1)z =0,$$ and the substitution $t = \sqrt{\lambda}\, x$ (taking $\lambda > 0$) gives $$t^2z''(t) + tz'(t)+(t^2-1)z(t)=0,$$ which is Bessel's differential equation of order $1$. So $$z(t) = AJ_1(t) + BY_1(t),$$ and therefore $$u(x) = AxJ_1(\sqrt{\lambda}x)+BxY_1(\sqrt{\lambda}x).$$
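Given this closed form, the eigenvalues follow from the boundary conditions: $u(1)=u(2)=0$ has a nontrivial solution $(A,B)$ iff the $2\times 2$ determinant $J_1(k)Y_1(2k)-J_1(2k)Y_1(k)$ vanishes, where $k=\sqrt{\lambda}$. The sketch below locates the first few roots numerically; the search range $[0.5, 15]$ and the name `f` are my choices, not from the answer.

```python
import numpy as np
from scipy.special import jv, yv
from scipy.optimize import brentq

# Determinant condition for a nontrivial solution with u(1) = u(2) = 0:
#   f(k) = J_1(k) Y_1(2k) - J_1(2k) Y_1(k) = 0,   k = sqrt(lambda).
def f(k):
    return jv(1, k) * yv(1, 2 * k) - jv(1, 2 * k) * yv(1, k)

# Bracket sign changes of f on a coarse grid, then refine each with brentq.
ks = np.linspace(0.5, 15.0, 600)
roots = [brentq(f, a, b) for a, b in zip(ks[:-1], ks[1:]) if f(a) * f(b) < 0]
eigenvalues = [k**2 for k in roots]        # lambda_n = k_n**2, all positive
print(eigenvalues[:3])
```

Since every root $k_n$ is real and positive, $\lambda_n = k_n^2 > 0$, confirming the sign the question asks about.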

SECOND ANSWER

If you let $u(x)=\sum a_nx^n$, you find that $a_n$ can be nonzero only for even $n$. So let $v(x^2)=u(x)$ and find the ODE that $v(y)$ satisfies.
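One way to check the parity claim (a sketch; the recurrence below is my own computation, not stated in the answer): multiplying the equation by $x^2$ gives $x u'' - u' + \lambda x u = 0$, and inserting $u = \sum a_n x^n$ yields $(m+1)(m-1)\,a_{m+1} = -\lambda\, a_{m-1}$. The $m=0$ case forces $a_1=0$, and $m=1$ forces $a_0=0$ when $\lambda\neq 0$, so the series starts at $x^2$ with $a_2$ free, and every odd coefficient vanishes.

```python
import sympy as sp

lam = sp.symbols('lam', positive=True)

# Recurrence from x*u'' - u' + lam*x*u = 0 with u = sum a_n x^n:
#   (m+1)*(m-1)*a_{m+1} = -lam*a_{m-1}.
# m = 0 forces a_1 = 0; m = 1 forces a_0 = 0 (lam != 0); a_2 is free (set to 1).
a = {0: sp.Integer(0), 1: sp.Integer(0), 2: sp.Integer(1)}
for m in range(2, 10):
    a[m + 1] = sp.simplify(-lam * a[m - 1] / ((m + 1) * (m - 1)))

print([a[n] for n in range(3, 11, 2)])   # odd coefficients: all zero
print(a[4])                              # first nontrivial even coefficient
```

The even coefficients then couple only to each other, which is exactly what makes the substitution $v(x^2)=u(x)$ natural.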