I am stuck on the following problem: Consider the following eigenvalue problem $$ u \in H_B(0,1), \; \langle Lu, Lv\rangle = \lambda (\alpha \langle u, v\rangle + \langle u', v'\rangle) \; \forall v\in H_B(0,1),$$ where $$H_B(0,1) = \begin{cases} \{u \in H^2(0,1): u(0) = u(1) = 0\},&\text{if}\quad B = 0,\\ \\ \{u \in H^2(0,1): u'(0) = u'(1) = 0\},& \text{if}\quad B = 1, \end{cases} $$ $$Lu:= -du'' + cu, \; d>0, \;c\in C[0,1], $$ and $$\alpha>0.$$ I need to show that the spectrum of this problem consists of real eigenvalues that tend to $+\infty$.
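Before attempting a proof, I ran a quick numerical sanity check: I discretized the Dirichlet case $B = 0$ with second-order finite differences and solved the resulting generalized symmetric eigenproblem. The concrete data $d = 1$, $c(x) = 1 + x$, $\alpha = 2$ and the grid size are arbitrary choices of mine, not part of the problem:

```python
import numpy as np
from scipy.linalg import eigh

# Numerical sanity check (not a proof) for the Dirichlet case B = 0.
# The data below are my own arbitrary choices: d = 1, c(x) = 1 + x, alpha = 2.
n = 200                      # number of interior grid points
h = 1.0 / (n + 1)            # mesh width on (0, 1)
x = np.linspace(h, 1.0 - h, n)
d, alpha = 1.0, 2.0
c = 1.0 + x                  # a sample continuous coefficient c

# D2 approximates -u'' with homogeneous Dirichlet boundary conditions
D2 = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

Lh = d * D2 + np.diag(c)     # discretization of Lu = -d u'' + c u
A = Lh.T @ Lh                # left-hand side form  <Lu, Lv>
B = alpha * np.eye(n) + D2   # right-hand side form alpha*<u, v> + <u', v'>

# Generalized symmetric-definite eigenproblem A u = lambda B u:
# eigh guarantees real eigenvalues, returned in ascending order.
lam = eigh(A, B, eigvals_only=True)
print(lam[:3])               # a few smallest eigenvalues
print(lam[-1])               # the largest one, which grows as the grid is refined
```

The computed spectrum is real, positive and unbounded as $n$ grows, which is at least consistent with the claim.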
My initial thought was to use the spectral theorem for compact self-adjoint operators. Integrating by parts gives $\langle u', v'\rangle = -\langle u'', v\rangle$, since the boundary terms vanish for both choices of $B$, so the problem can be rewritten in strong formulation as \begin{equation} L^\ast L u = \lambda (\alpha u - u''), \; u \in \tilde{H}_B(0,1), \end{equation} where \begin{equation} \tilde{H}_B(0,1) = \begin{cases} \{u \in H^4(0,1): u(0) = u(1) = 0\},&\text{if}\quad B = 0,\\ \\ \{u \in H^4(0,1): u'(0) = u'(1) = 0\},& \text{if}\quad B = 1, \end{cases} \end{equation} (I am suppressing the natural boundary conditions that the integration by parts on the left-hand side also produces.) I wanted to show that the operator $(L^\ast L + \sigma I)^{-1}$ is compact and self-adjoint. If this is the case, then the spectral theorem yields an orthonormal basis $(\varphi_k)$ of $L^2(0,1)$ consisting of eigenfunctions of $(L^\ast L + \sigma I)^{-1}$, and the corresponding sequence of eigenvalues $(\mu_k)$ is real and converges to $0$.
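To spell out the last step for myself: a nonzero eigenvalue $\mu_k$ of the resolvent corresponds to an eigenvalue of $L^\ast L$ via
$$ (L^\ast L + \sigma I)^{-1}\varphi_k = \mu_k \varphi_k \quad\Longleftrightarrow\quad L^\ast L\,\varphi_k = \left(\frac{1}{\mu_k} - \sigma\right)\varphi_k, $$
so $\mu_k \to 0$ (with $\mu_k > 0$, provided $\sigma$ is chosen so that $L^\ast L + \sigma I$ is positive) would force $1/\mu_k - \sigma \to +\infty$. Of course, this only treats $L^\ast L$ by itself; the form $\alpha\langle u,v\rangle + \langle u', v'\rangle$ on the right-hand side of the original problem would still have to be incorporated somehow.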
Honestly, I don't really know how to proceed further and would appreciate any help and useful hints. I think showing self-adjointness is not problematic, but how do I show compactness? I feel like I should consider an operator of the form $E(L^\ast L + \sigma I)^{-1} : L^2(0,1)\to L^2(0,1)$, where $E$ is a suitable compact embedding. Am I on the right track here? Is considering the strong formulation of the eigenvalue problem a good idea in general?
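Spelled out, the factorization I am imagining (under the assumption, which I have not verified, that the resolvent maps $L^2(0,1)$ boundedly into $H^4(0,1)$) is
$$ (L^\ast L + \sigma I)^{-1} = E \circ R_\sigma, \qquad R_\sigma : L^2(0,1) \to H^4(0,1) \ \text{bounded}, \qquad E : H^4(0,1) \hookrightarrow L^2(0,1) \ \text{compact}, $$
where the compactness of the embedding $E$ would come from the Rellich–Kondrachov theorem, making the composition compact on $L^2(0,1)$.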
Thank you very much in advance for your help!