Convergence of a sequence in the sense of distributions


My problem is the following:

Does $u_n(x)=ne^{inx}$ converge to zero in $\mathcal{D}'(\mathbb{R})$?

Let $\varphi \in C^{\infty}_{c}(\mathbb{R})$. Substituting $y=nx$, \begin{align*} \langle u_n, \varphi \rangle&=\int_{\operatorname{supp} \varphi} u_n (x)\, \varphi(x)\, dx\\ &=\int e^{iy}\, \varphi\Big(\frac{y}{n}\Big)\, dy\\ & \rightarrow \int e^{iy}\, \varphi(0)\, dy\\ &=\Big(\int e^{iy}\, dy\Big) \langle \delta, \varphi \rangle. \end{align*}

Is $\int e^{iy}\, dy$ equal to $0$ or not? I am confused about this step.

Thank you.

Best answer:

Let $$f_n(x)=-\frac{e^{inx}}{n}.$$ Since $|f_n(x)|=\frac{1}{n} \to 0$ uniformly, $f_n\to 0$ in the sense of distributions, and therefore so does $f_n''(x)=n e^{inx}$

(by the definition of the distributional derivative, $\langle f_n'',\varphi\rangle = \langle f_n,\varphi''\rangle$, the two sign changes cancelling).
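To spell out the step called obvious above (a short sketch of the estimate): for any test function $\varphi \in C^{\infty}_{c}(\mathbb{R})$,
$$\big|\langle f_n'', \varphi\rangle\big| = \big|\langle f_n, \varphi''\rangle\big| = \left|\int -\frac{e^{inx}}{n}\,\varphi''(x)\,dx\right| \le \frac{1}{n}\int |\varphi''(x)|\,dx \longrightarrow 0,$$
so $\langle n e^{inx}, \varphi\rangle \to 0$ for every $\varphi$, i.e. $u_n \to 0$ in $\mathcal{D}'(\mathbb{R})$.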

This is indeed the point of the topology on distributions: it makes the linear map $u \mapsto u'$ continuous.
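As a quick numerical sanity check (a sketch, not part of the answer above; the particular bump function, grid size, and values of $n$ are my own choices), one can pair $u_n$ with a standard compactly supported bump and watch the pairing decay:

```python
import numpy as np

def bump(x):
    """Standard smooth test function: exp(-1/(1-x^2)) on (-1, 1), zero outside."""
    out = np.zeros_like(x)
    inside = np.abs(x) < 1
    out[inside] = np.exp(-1.0 / (1.0 - x[inside] ** 2))
    return out

# Uniform grid covering the support of the bump; the integrand vanishes (with
# all derivatives) at the endpoints, so a plain Riemann sum is already accurate.
x, h = np.linspace(-1.0, 1.0, 400001, retstep=True)
phi = bump(x)

for n in [10, 50, 100, 200, 400]:
    # <u_n, phi> = integral of n * exp(i n x) * phi(x) dx, approximated on the grid.
    pairing = np.sum(n * np.exp(1j * n * x) * phi) * h
    print(f"n = {n:4d}   |<u_n, phi>| ~ {abs(pairing):.3e}")
```

For this bump the pairing decays rapidly as $n$ grows (until the printed values are dominated by floating-point roundoff), consistent with $u_n \to 0$ in $\mathcal{D}'(\mathbb{R})$.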