Convergence of extensions


Let $v_n\in L^2(\mathbb{R}^{n}_+)$ be a sequence converging to $v\in L^2(\mathbb{R}^{n}_+)$, and suppose furthermore that $(I-\Delta) v_n=0$. Here we interpret $(I-\Delta):\mathcal{D}'_{(+)}\rightarrow \mathcal{D}'_{(+)}$, where $\mathcal{D}_{(+)}$ denotes the space of test functions on $\mathbb{R}^n_{(+)}$ and $\mathcal{D}'_{(+)}$ the corresponding space of distributions. The subscript $(+)$ means that every statement should be read both with and without the plus sign, i.e. both on $\mathbb{R}^n$ and on the half-space $\mathbb{R}^n_+$.

The sequence $(I-\Delta) v_n$ trivially converges to zero in $L^2(\mathbb{R}^{n}_+)$, since every term is zero. Now define $E:L^2(\mathbb{R}^{n}_+)\rightarrow L^2(\mathbb{R}^{n})$ to be the extension-by-zero operator, i.e. $E(u)=u$ on $\mathbb{R}^n_+$ and $E(u)=0$ outside $\mathbb{R}^n_+$; this $E$ is continuous. I need to show rigorously that $(I-\Delta) E(v_n)$, as a distribution on $\mathbb{R}^n$, is also in $L^2(\mathbb{R}^n)$ and converges to zero in that space.
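Unwinding the definitions, the quantity one has to control is the following pairing (this is only the formal computation suggested by the notation above, not a proof):

```latex
% For a test function \phi \in \mathcal{D}(\mathbb{R}^n):
\begin{align*}
\langle (I-\Delta)E(v_n),\, \phi \rangle
  &= \langle E(v_n),\, (I-\Delta)\phi \rangle
   = \int_{\mathbb{R}^n_+} v_n\,(I-\Delta)\phi \,\mathrm{d}x .
\end{align*}
% If supp(\phi) lies in the open half-space \mathbb{R}^n_+, then
% \phi restricts to an element of \mathcal{D}_+ and the hypothesis
% (I-\Delta)v_n = 0 in \mathcal{D}'_+ makes the integral vanish.
% The difficulty is test functions whose support meets the boundary
% \partial\mathbb{R}^n_+, where integrating by parts formally
% produces boundary (trace) terms in v_n and its normal derivative.
```

So the question amounts to justifying why these boundary contributions stay in $L^2(\mathbb{R}^n)$ and vanish in the limit.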

Any help is kindly appreciated,

Aris