How to compute the limit of a distribution


I would like to know whether the sequence $ T_n = \displaystyle \sum_{k=1}^n \dfrac{1}{k^{2}} \delta_{\frac{1}{k}} $ converges in $ \mathcal{D} ' ( \mathbb{R} ) $. If it converges in $ \mathcal{D} ' ( \mathbb{R} ) $, we must compute its limit. Thanks a lot.


Let $\phi\in\mathcal{D}(\mathbb{R})$ be a test function. Then $$ T_n(\phi)=\sum_{k=1}^n\frac1{k^2}\,\phi\Bigl(\frac1k\Bigr). $$ Since $\phi$ is bounded, say $|\phi|\le M$, the series is absolutely convergent (it is dominated by $\sum_k M/k^2<\infty$), so $$ \lim_{n\to\infty}T_n(\phi)=\lim_{n\to\infty}\sum_{k=1}^n\frac1{k^2}\,\phi\Bigl(\frac1k\Bigr)=\sum_{k=1}^\infty\frac1{k^2}\,\phi\Bigl(\frac1k\Bigr). $$ Define $$ T(\phi)=\sum_{k=1}^\infty\frac1{k^2}\,\phi\Bigl(\frac1k\Bigr). $$ The estimate $|T(\phi)|\le\frac{\pi^2}{6}\sup|\phi|$ shows that $T$ is a distribution (in fact of order $0$), and $T_n$ converges to $T$ in the distribution sense.
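As a quick numerical sanity check (not part of the answer itself), one can evaluate the partial sums $T_n(\phi)$ for a concrete compactly supported test function, here the standard bump function $\phi(x)=e^{-1/(1-x^2)}$ on $(-1,1)$, and watch them converge:

```python
import math

def phi(x):
    """Standard bump function: smooth, supported on (-1, 1)."""
    if abs(x) >= 1.0:
        return 0.0
    return math.exp(-1.0 / (1.0 - x * x))

def T_partial(n):
    """Partial sum T_n(phi) = sum_{k=1}^n phi(1/k) / k^2."""
    return sum(phi(1.0 / k) / k**2 for k in range(1, n + 1))

# The partial sums increase (phi >= 0 here) and stabilize quickly,
# since the tail is bounded by sup|phi| * sum_{k>n} 1/k^2.
for n in (10, 100, 1000, 10000):
    print(n, T_partial(n))
```

The printed values settle to the limit $T(\phi)=\sum_{k=1}^\infty \phi(1/k)/k^2$, consistent with the bound $|T(\phi)|\le\frac{\pi^2}{6}\sup|\phi|$.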