Sign of the eigenvalues of the Laplacian


I have to prove that, given the problem $$ \begin{cases} \Delta\:g+ \lambda \:g=0\quad {\rm in}\;D \\ g=0\quad {\rm on} \; \partial D\end{cases}$$ the eigenvalues satisfy $\lambda>0$. I multiply the first equation by $g$ and then, using the divergence theorem, I obtain $$ -\int_{D}{\lvert \nabla g\rvert ^2 \,dx}+\lambda\int_{D}{g^2 \,dx}=0, \qquad(1)$$ so if $\lambda\leq0$ I reach a contradiction (since an eigenfunction $g$ is not identically zero). How can I obtain $(1)$ using Green's theorem instead of the divergence theorem?
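For reference, here is one way to write out the step that produces $(1)$, using Green's first identity (which itself follows from applying the divergence theorem to the field $g\nabla g$). Multiplying the equation by $g$ and integrating over $D$ gives

$$\int_D g\,\Delta g \,dx + \lambda \int_D g^2 \,dx = 0.$$

Green's first identity states

$$\int_D g\,\Delta g \,dx = \oint_{\partial D} g\,\frac{\partial g}{\partial n}\,dS - \int_D \lvert\nabla g\rvert^2 \,dx,$$

and since $g = 0$ on $\partial D$ the boundary integral vanishes, leaving exactly

$$-\int_D \lvert\nabla g\rvert^2 \,dx + \lambda \int_D g^2 \,dx = 0.$$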

Best answer

I'm not sure that we are talking about the same Green's theorem; there are many versions of it. In my experience, Green's theorem is formulated in $\mathbb{R}^2$, and there it is equivalent to the divergence theorem in dimension 2: http://en.wikipedia.org/wiki/Green's_theorem#Relationship_to_the_divergence_theorem
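A quick sketch of that equivalence, assuming the standard circulation form of Green's theorem: for a smooth vector field $F=(F_1,F_2)$ on $D\subset\mathbb{R}^2$, take $P=-F_2$ and $Q=F_1$ in $\oint_{\partial D}(P\,dx+Q\,dy)=\iint_D\left(\frac{\partial Q}{\partial x}-\frac{\partial P}{\partial y}\right)dA$, which yields

$$\iint_D \left(\frac{\partial F_1}{\partial x}+\frac{\partial F_2}{\partial y}\right)dA = \oint_{\partial D}\left(F_1\,dy - F_2\,dx\right) = \oint_{\partial D} F\cdot n\,ds,$$

i.e. the divergence theorem in the plane. Applying this with $F = g\nabla g$ then reproduces $(1)$ in the two-dimensional case.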