Question about error analysis, and why the choice of the norm is important



I'm currently studying approximation methods for the variational formulation of elliptic problems, for example $-\Delta u = f$.
When describing the error estimate for a spectral method, we have the following bound if $u\in H^m$ and $f\in H^r$: $$\| u - u_n \|_{H^1} \leq c (n^{1-m} \|u\|_{H^m} + n^{-r} \|f\|_{H^r}). $$ But my issue is that when $m=1$, the first term becomes $n^{0}\|u\|_{H^1}$, which does not decay, so we cannot conclude that $u_n$ converges to $u$ in $H^1$ (the estimate is known to be optimal).
Now we have a second error estimate, in the $L^2$-norm: $$\|u-u_n\|_{L^2} \leq c(n^{-m}\|u\|_{H^m} + n^{-r} \|f\|_{H^r}). $$ So $u_n$ always converges to $u$ in $L^2$. My question is: does this really matter in practice, when we perform the actual computation on a computer? I know that we are looking for a solution in $H^1$, so it is natural to measure the error in the $H^1$ norm. But if we compute an approximate solution $u_n$ that converges to $u$ in $L^2$, isn't that enough for practical purposes when we want to compute the solution of a PDE?
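To make the gap between the two norms concrete, here is a small numerical sketch. It is an assumption on my part that the spectral method in question is a periodic Fourier truncation; I use a triangle wave, which is continuous and piecewise linear (so it lies in $H^m$ for every $m < 3/2$ but in no higher $H^m$), and measure the truncation error in both norms via Parseval's identity. The observed rates differ: the $L^2$ error decays roughly like $n^{-3/2}$, the $H^1$ error only like $n^{-1/2}$, so the choice of norm changes the convergence rate you actually see.

```python
import numpy as np

# Triangle wave on [0, 2*pi): continuous, piecewise linear, mean zero.
# Its Fourier coefficients decay like 1/k^2, so it is in H^m for m < 3/2.
N = 2**14
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
u = np.pi / 2 - np.abs(x - np.pi)

uhat = np.fft.fft(u) / N                 # Fourier coefficients c_k
k = np.fft.fftfreq(N, d=1.0 / N)         # integer wavenumbers

l2_errs, h1_errs = [], []
ns = [8, 16, 32, 64, 128]
for n in ns:
    # Spectral truncation: keep only modes |k| <= n.
    un_hat = np.where(np.abs(k) <= n, uhat, 0)
    err_hat = uhat - un_hat
    # Norms via Parseval (up to a common 2*pi factor, irrelevant for rates):
    # ||e||_{L^2}^2 ~ sum |e_k|^2,  ||e||_{H^1}^2 ~ sum (1 + k^2) |e_k|^2
    l2_errs.append(np.sqrt(np.sum(np.abs(err_hat) ** 2)))
    h1_errs.append(np.sqrt(np.sum((1 + k ** 2) * np.abs(err_hat) ** 2)))
    print(n, l2_errs[-1], h1_errs[-1])
```

Both errors go to zero here because the triangle wave is slightly smoother than $H^1$; the point is that the $H^1$ error decays much more slowly, and for a function with exactly $H^1$ regularity the $H^1$ error would stagnate while the $L^2$ error still converges.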