How do these two definitions of uniformly elliptic fit together?


Consider a bounded domain $\Omega\subset\mathbb{R}^n$ (here $n=2$). We defined a second-order semi-linear PDE, where $A=(a_{ij})$ is the coefficient matrix of its principal part and $a_{ij}=a_{ji}$, to be uniformly elliptic if there exist constants $0<\lambda\leq\Lambda<\infty$ such that for all $x\in\Omega$ and all $\xi\in\mathbb{R}^n$ $$ \lambda\xi^2\leq \underbrace{\sum_{i,j=1}^{n}a_{ij}\xi_i\xi_j}_{=a(\xi,\xi)}\leq\Lambda\xi^2, \qquad \xi^2:=\sum_{i=1}^{n}\xi_i^2. $$ Other people define it differently, saying that the PDE is uniformly elliptic if there is a common positive lower bound and a common upper bound for all eigenvalues of $A$, uniformly in $x\in\Omega$.

Could you please explain why these two definitions are equivalent?

Unfortunately, I do not see that.

But I was told they are equivalent because the minimum and the maximum of the quotient $$ \frac{a(\xi,\xi)}{\xi^2} $$ are eigenvalues of $A$, attained at the corresponding eigenvectors.

But I do not understand this! Maybe you can explain it to me.
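For concreteness, here is a small numerical check of what I was told, using a $2\times2$ example matrix that I picked myself (so the specific numbers are not from my lecture notes):

```python
import math

# Example symmetric matrix (my own choice, not from the lecture):
#     A = [[2, 1],
#          [1, 2]]
# Its eigenvalues are 1 and 3, with eigenvectors (1, -1) and (1, 1).
A = [[2.0, 1.0], [1.0, 2.0]]

def rayleigh(xi):
    """Compute a(xi, xi) / xi^2 = (sum_ij a_ij xi_i xi_j) / (sum_i xi_i^2)."""
    quad = sum(A[i][j] * xi[i] * xi[j] for i in range(2) for j in range(2))
    norm2 = sum(x * x for x in xi)
    return quad / norm2

# At the eigenvectors, the quotient equals the corresponding eigenvalue:
print(rayleigh([1.0, -1.0]))  # 1.0
print(rayleigh([1.0, 1.0]))   # 3.0

# For any other vector it lies between the smallest and largest eigenvalue:
print(1.0 <= rayleigh([2.0, 5.0]) <= 3.0)  # True
```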

Sincerely yours,

math12


There are 2 best solutions below


Let $A$ be a $n\times n$ symmetric matrix. Consider the function $f:\mathbb{R}^n\to\mathbb{R}$ defined by $$f(\xi)=\frac{\langle A\xi,\xi\rangle}{|\xi|^2}$$

Note that $f(c\xi)=f(\xi)$ for all $c>0$; hence there is no loss of generality in minimizing $f$ over the set $S=\{x\in\mathbb{R}^n:\ \|x\|=1\}$. To this end, we use Lagrange multipliers.

Define $L(\xi,\lambda)=\langle A\xi,\xi\rangle+\lambda(1-\|\xi\|^2)$. Because $f$ is continuous and $S$ is compact, $f$ attains a maximum and a minimum over $S$. Assume, for example, that $x$ is the point where the minimum is attained; then there is a $\lambda$ such that $$\nabla L(x,\lambda)=0\tag{1}$$

We conclude from $(1)$ that $$\langle Ax,\eta\rangle+\langle A\eta,x\rangle-2\lambda\langle x,\eta\rangle=0,\ \forall\eta\in\mathbb{R}^n\tag{2}$$

We use the symmetry of $A$ to conclude from $(2)$ that $$Ax=\lambda x$$

Moreover, note that $$f(x)=\langle Ax,x\rangle =\lambda$$

Therefore the least eigenvalue of $A$ is the minimum value of $f$ over $S$; the same argument at the maximum point shows that the largest eigenvalue is the maximum value of $f$. Now apply these ideas to your problem.
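Here is a quick numerical sketch of this argument (the $2\times2$ matrix below is an arbitrary example, not taken from the question): parametrize the unit circle $S$ by $\xi(\theta)=(\cos\theta,\sin\theta)$ and take the extremes of $\langle A\xi,\xi\rangle$ over a fine grid; they approximate the extreme eigenvalues.

```python
import math

# Arbitrary symmetric example matrix with eigenvalues 1 and 3.
A = [[2.0, 1.0], [1.0, 2.0]]

def quad_form(xi):
    """<A xi, xi> for xi on the unit circle (so |xi| = 1 already)."""
    return sum(A[i][j] * xi[i] * xi[j] for i in range(2) for j in range(2))

# Minimize/maximize over the compact set S = {|xi| = 1} by brute force:
N = 10000
values = [quad_form((math.cos(2 * math.pi * k / N), math.sin(2 * math.pi * k / N)))
          for k in range(N)]

# The extreme values of <A xi, xi> on S approximate the extreme eigenvalues:
print(min(values))  # close to the least eigenvalue, 1
print(max(values))  # close to the largest eigenvalue, 3
```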


Another way to understand this:

Since $a_{ij}$ is symmetric and real-valued, you can apply the spectral theorem to it.

In particular there is an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $a_{ij}$. Let us call them $v^{(\ell)}$ with eigenvalues $\lambda^{(\ell)}$, where $\ell$ runs from $1$ to $n$.

Since any vector decomposes uniquely with respect to a basis, you can write $\xi_i = \sum_\ell \xi_{(\ell)} v^{(\ell)}_i$, where the $\xi_{(\ell)}$ are real coefficients. We then have

$$ \sum_{i,j} a_{ij} \xi_i \xi_j = \sum_{\ell,m} \xi_{(\ell)}\xi_{(m)}\,\lambda^{(m)} \langle v^{(\ell)}, v^{(m)}\rangle = \sum_\ell |\xi_{(\ell)}|^2 \lambda^{(\ell)} $$

by the eigen-decomposition and the orthonormality of the $v^{(\ell)}$.

Note also that

$$ \sum \xi_i \xi_i = \sum_{\ell} |\xi_{(\ell)}|^2 $$

From this the equivalence of the two definitions should be clear: writing $\lambda_{\min}$ and $\lambda_{\max}$ for the smallest and largest eigenvalues, $$ \lambda_{\min}\sum_{\ell} |\xi_{(\ell)}|^2 \leq \sum_\ell \lambda^{(\ell)} |\xi_{(\ell)}|^2 \leq \lambda_{\max}\sum_{\ell} |\xi_{(\ell)}|^2, $$ i.e. $\lambda_{\min}\,\xi^2 \leq a(\xi,\xi) \leq \lambda_{\max}\,\xi^2$.
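As a sanity check of this identity, here is a small computation with an example matrix of my own choosing: decompose a vector $\xi$ in the orthonormal eigenbasis and compare $\sum_{i,j} a_{ij}\xi_i\xi_j$ with $\sum_\ell \lambda^{(\ell)}|\xi_{(\ell)}|^2$.

```python
import math

# Example symmetric matrix with orthonormal eigenvectors
#   v1 = (1, 1)/sqrt(2)  with eigenvalue 3,
#   v2 = (1, -1)/sqrt(2) with eigenvalue 1.
A = [[2.0, 1.0], [1.0, 2.0]]
s = 1.0 / math.sqrt(2.0)
eigenpairs = [((s, s), 3.0), ((s, -s), 1.0)]

xi = (2.0, 5.0)  # an arbitrary test vector

# Coefficients of xi in the eigenbasis: xi_(l) = <xi, v^(l)>
coeffs = [xi[0] * v[0] + xi[1] * v[1] for v, _ in eigenpairs]

lhs = sum(A[i][j] * xi[i] * xi[j] for i in range(2) for j in range(2))
rhs = sum(c * c * lam for c, (_, lam) in zip(coeffs, eigenpairs))

# Both sides agree (= 78 for this xi, up to rounding), and
# |xi|^2 equals the sum of the squared coefficients (= 29):
print(lhs, rhs)
print(sum(x * x for x in xi), sum(c * c for c in coeffs))
```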