I am currently working on the following problem and, unfortunately, have not made any progress for days. I would be really grateful for help.
Let $\Omega \subseteq \mathbb{R}^2$ be a bounded polygon and consider the eigenvalues of the problem $-\Delta u = \lambda u$ in $\Omega$, $u = 0$ on $\partial \Omega$. Weyl's asymptotic formula shows that $\lim_{\lambda \to \infty} \# \{n \in \mathbb{N} : \lambda_n \leq \lambda \}/ \lambda = |\Omega|/(4 \pi)$. With $E_n := P^0(\mathcal{T}_h)$, the space of piecewise constants on a uniform mesh $\mathcal{T}_h$ with mesh size $h > 0$ and $n = \# \mathcal{T}_h$, what is the convergence rate (error vs. degrees of freedom) of $\inf_{v \in E_n} \Vert u - v \Vert_{L^2(\Omega)}$? Show that this rate can, in general, not be improved. (The only information available about $u$ is that $u \in H_0^1(\Omega)$.)
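(Not part of the question itself, but as a quick sanity check of Weyl's formula: on the unit square the Dirichlet eigenvalues are known explicitly, $\lambda_{m,k} = \pi^2(m^2+k^2)$, so the counting function can be compared with $|\Omega|\lambda/(4\pi)$ directly. A short Python sketch, assuming NumPy is available; the cutoff $M$ and the $\lambda$-range are arbitrary choices:)

```python
import numpy as np

# Dirichlet eigenvalues of -Laplace on the unit square (|Omega| = 1):
# lambda_{m,k} = pi^2 (m^2 + k^2), m, k >= 1.
M = 400  # index cutoff; large enough for the lambda-range below
m = np.arange(1, M + 1)
lam = np.sort((np.pi**2 * (m[:, None]**2 + m[None, :]**2)).ravel())

# Weyl: #{n : lambda_n <= L} / L  ->  |Omega| / (4 pi) ~ 0.0796
for L in (1e3, 1e4, 1e5):
    count = np.searchsorted(lam, L, side="right")
    print(L, count / L)
```

The ratio approaches $1/(4\pi) \approx 0.0796$ slowly from below; the deviation is the well-known boundary correction of order $\sqrt{\lambda}$.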
I was given the hint to use the following result (which I was able to prove):
Consider an orthonormal basis $(u_n)_{n \in \mathbb{N}}$ of a Hilbert space $X$ and a sequence $0 < \lambda_1 \leq \lambda_2 \leq \dots \to \infty$. Define the set $$ B:= \left \{ \sum_{j=1}^{\infty}\alpha_j u_j : \alpha_j \in \mathbb{R}, \ \sum_{j=1}^{\infty} |\alpha_j|^2 \lambda_j \leq 1 \right \}. $$ Then $$ \inf_{\substack{E_n \subseteq X \\ \dim(E_n)=n}} \sup_{u \in B} \inf_{v \in E_n} \Vert u-v \Vert_X = \lambda_{n+1}^{-1/2}. $$ So I want to show that the following holds for the Helmholtz problem:
$$\inf_{v \in E_n} \Vert u - v \Vert_{L^2(\Omega)} \leq C\, n^{-1/2},$$ and that the rate $n^{-1/2}$ is sharp. (This matches the hint: by Weyl's formula $\lambda_{n+1} \sim 4\pi n/|\Omega|$, so $\lambda_{n+1}^{-1/2} \sim C\, n^{-1/2}$.)
If I understood that correctly, the Bramble–Hilbert lemma should simply give the upper bound.
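The upper-bound rate is easy to observe numerically: the $L^2$-best piecewise-constant approximation on a uniform $N \times N$ mesh is given by the cell averages, and for a smooth function the error behaves like $h \sim n^{-1/2}$. A small Python sketch (the test function $\sin(\pi x)\sin(\pi y)$ and the mesh sizes are just illustrative choices; the cell averages are evaluated via exact 1D integrals):

```python
import numpy as np

def best_pc_error(N):
    """L^2 error of the best piecewise-constant approximation of
    u(x, y) = sin(pi x) sin(pi y) on a uniform N x N mesh of the unit square."""
    h = 1.0 / N
    edges = np.linspace(0.0, 1.0, N + 1)
    # exact 1D cell integrals of sin(pi x): (cos(pi a) - cos(pi b)) / pi
    I = (np.cos(np.pi * edges[:-1]) - np.cos(np.pi * edges[1:])) / np.pi
    means = np.outer(I, I) / h**2              # cell averages (tensor product)
    # ||u - P u||^2 = ||u||^2 - sum_K |K| * mean_K^2, with ||u||^2 = 1/4
    err2 = 0.25 - h**2 * np.sum(means**2)
    return np.sqrt(max(err2, 0.0))

for N in (8, 16, 32, 64):
    n = N * N                                   # degrees of freedom
    print(n, best_pc_error(N) * np.sqrt(n))     # roughly constant => rate n^{-1/2}
```

The product $\text{error} \cdot \sqrt{n}$ stays roughly constant, consistent with the claimed rate $n^{-1/2}$.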
The lower bound is the problematic part, which is probably the crux of the matter. It is not clear to me how to use the hint (in particular the set $B$) here.
I would be very grateful for ideas.
If someone still needs the solution:
First we prove $$ \inf_{\substack{E_n \subseteq X \\ \dim(E_n)=n}} \sup_{u \in B} \inf_{v \in E_n} \Vert u-v \Vert_X \leq \lambda_{n+1}^{-1/2}. $$ For that we choose $E_n = V_n := \operatorname{span}\{u_1, \dots, u_n\}$; then every $v \in V_n$ has a unique representation $$ v = \sum_{i=1}^{n} \mu_i u_i, \quad (\mu_1, \dots, \mu_n) \in \mathbb{R}^n. $$ Now we compute, using Parseval's identity in the last step, $$ \Vert u-v \Vert_X^2 = \left \Vert \sum_{i=1}^{\infty} \alpha_i u_i - \sum_{i=1}^{n} \mu_i u_i \right \Vert_X^2 = \left \Vert \sum_{i=1}^{n} (\alpha_i - \mu_i) u_i + \sum_{i=n+1}^{\infty} \alpha_i u_i \right \Vert_X^2 = \sum_{i=1}^{n} (\alpha_i - \mu_i)^2 + \sum_{i=n+1}^{\infty} \alpha_i^2. $$ The infimum over $(\mu_i)$ is attained at $\mu_i = \alpha_i$, so $$ \inf_{v \in V_n} \Vert u-v \Vert_X = \inf_{(\mu_i) \in \mathbb{R}^n} \left (\sum_{i=1}^{n} (\alpha_i - \mu_i)^2 + \sum_{i=n+1}^{\infty} \alpha_i^2\right )^{1/2} = \left (\sum_{i=n+1}^{\infty} \alpha_i^2\right )^{1/2}. $$ Finally $$ \sup_{u \in B} \inf_{v \in V_n} \Vert u-v \Vert_X = \sup_{u \in B} \left (\sum_{i=n+1}^{\infty} \alpha_i^2\right )^{1/2} = \left (\sup_{u \in B} \sum_{i=n+1}^{\infty} \alpha_i^2\right )^{1/2} = \lambda_{n+1}^{-1/2}. $$

The last equality holds because $$ \sum_{j=1}^{\infty} \alpha_j^2 \lambda_j \leq 1 $$ implies $$ \sum_{j=1}^{\infty} \alpha_j^2 \frac{\lambda_j}{\lambda_{n+1}} \leq \frac{1}{\lambda_{n+1}}, $$ and the monotonicity of $(\lambda_j)_{j \in \mathbb{N}}$ then gives $$ \sum_{j=n+1}^{\infty} \alpha_j^2 \leq \frac{1}{\lambda_{n+1}}. $$ This bound is attained: choosing $\alpha_{n+1} = \lambda_{n+1}^{-1/2}$ and $\alpha_j = 0$ for all $j \neq n+1$ yields $$ \sup_{u \in B} \sum_{i=n+1}^{\infty} \alpha_i^2 = \frac{1}{\lambda_{n+1}}. $$
It remains to prove $$ \inf_{\substack{E_n \subseteq X \\ \dim(E_n)=n}} \sup_{u \in B} \inf_{v \in E_n} \Vert u-v \Vert_X \geq \lambda_{n+1}^{-1/2}. $$ Let $E_n \subseteq X$ be an arbitrary subspace with $\dim(E_n)=n \in \mathbb{N}$. We have to show that $$ \exists u \in B \ \forall v \in E_n : \ \frac{1}{\lambda_{n+1}} \leq \Vert u-v \Vert_X^2. $$ Since $\dim(\operatorname{span}\{u_1,\dots,u_{n+1}\}) = n+1 > n = \dim(E_n)$, the orthogonal projection of $\operatorname{span}\{u_1,\dots,u_{n+1}\}$ onto $E_n$ has a nontrivial kernel, so there exists $(\alpha_1,\dots,\alpha_{n+1}) \neq 0$ with $$ \hat{u} := \sum_{i=1}^{n+1} \alpha_i u_i \in E_n^{\perp}. $$ After rescaling we may assume $\sum_{i=1}^{n+1} \alpha_i^2 \lambda_i = 1$; setting $\alpha_i = 0$ for $i > n+1$, this gives $\hat{u} \in B$, and the monotonicity of $(\lambda_i)_{i \in \mathbb{N}}$ yields $$ \sum_{i=1}^{n+1} \alpha_i^2 \geq \sum_{i=1}^{n+1} \alpha_i^2 \frac{\lambda_i}{\lambda_{n+1}} = \frac{1}{\lambda_{n+1}}. $$ Taking $u = \hat{u}$, the Pythagorean theorem (note $\hat{u} \perp E_n$) gives, for every $v \in E_n$, $$ \Vert \hat{u} - v \Vert_X^2 = \Vert \hat{u} \Vert_X^2 + \Vert v \Vert_X^2 \geq \Vert \hat{u} \Vert_X^2 = \sum_{i=1}^{n+1} \alpha_{i}^2 \geq \frac{1}{\lambda_{n+1}}. $$
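Both bounds can also be checked numerically in a finite-dimensional truncation: with $X = \mathbb{R}^m$, the set $B$ is an ellipsoid, $B = \{\Lambda^{-1/2}w : |w| \leq 1\}$, and $\sup_{u \in B}\inf_{v \in E_n}\Vert u-v\Vert$ equals the largest singular value of $(I-P)\Lambda^{-1/2}$, where $P$ is the orthogonal projector onto $E_n$. A short Python sketch (the dimensions and the choice $\lambda_j = j$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 12, 5                       # truncation dimension, subspace dimension
lam = np.arange(1.0, m + 1.0)      # any nondecreasing positive sequence works

def worst_error(E):
    """sup_{u in B} inf_{v in E_n} ||u - v|| for E_n = range(E), E orthonormal."""
    P = E @ E.T                    # orthogonal projector onto E_n
    # B = { Lambda^{-1/2} w : |w| <= 1 }, hence the sup-inf is the
    # largest singular value of (I - P) Lambda^{-1/2}
    return np.linalg.norm((np.eye(m) - P) @ np.diag(lam**-0.5), ord=2)

# the subspace span{u_1, ..., u_n} attains lambda_{n+1}^{-1/2} ...
E_opt = np.eye(m)[:, :n]
print(worst_error(E_opt), lam[n] ** -0.5)

# ... and no other n-dimensional subspace does better (the lower bound)
for _ in range(100):
    E, _ = np.linalg.qr(rng.standard_normal((m, n)))
    assert worst_error(E) >= lam[n] ** -0.5 - 1e-12
```

This mirrors the proof exactly: the optimal subspace is spanned by the first $n$ basis vectors, and every other $n$-dimensional subspace is at least as bad.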