limit of a distribution


Show that $$\lim_{\epsilon \to 0^{+}} \left\langle \frac{1}{\pi}\frac{\epsilon}{x^{2}+ \epsilon^{2}} ,\phi \right\rangle =\langle \delta,\phi\rangle $$ where $\phi\in D(\mathbb{R}) $ and $ \frac{1}{\pi}\frac{\epsilon}{x^{2}+ \epsilon^{2}} \in D' (\mathbb{R}) $.

What I tried is this: $\frac{1}{\pi}\frac{\epsilon}{x^{2}+ \epsilon^{2}}$ is an integrable function, so the pairing $\left\langle \frac{1}{\pi}\frac{\epsilon}{x^{2}+ \epsilon^{2}} ,\phi\right\rangle$ is given by an integral over $\mathbb{R}$:

$$\left\langle\frac{1}{\pi}\frac{\epsilon}{x^{2}+ \epsilon^{2}} ,\phi\right\rangle = \frac{1}{\pi}\int_{- \infty}^{\infty}\frac{\epsilon}{x^{2}+ \epsilon^{2}}\, \phi (x)\, dx$$

How can I take the limit of this integral as $\epsilon \to 0^{+}$?

Thank you in advance.

On BEST ANSWER

As was pointed out to you in the comments, you should consider the delta sequence $$ f_n(x) = \frac{n}{\pi}\,\frac{1}{1+n^2x^2},$$ which gives

$$\lim_{n\to \infty} \langle f_n(x), \phi(x) \rangle = \langle \delta(x), \phi(x) \rangle.$$
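A short sketch of why this holds: substitute $u = nx$ and apply dominated convergence,

$$\langle f_n,\phi\rangle = \frac{1}{\pi}\int_{-\infty}^{\infty}\frac{n\,\phi(x)}{1+n^2x^2}\,dx \overset{u=nx}{=} \frac{1}{\pi}\int_{-\infty}^{\infty}\frac{\phi(u/n)}{1+u^2}\,du.$$

Since $|\phi(u/n)| \le \sup|\phi|$ (which is integrable against $\frac{1}{1+u^2}$) and $\phi(u/n)\to\phi(0)$ pointwise, we get

$$\lim_{n\to\infty}\langle f_n,\phi\rangle = \frac{\phi(0)}{\pi}\int_{-\infty}^{\infty}\frac{du}{1+u^2} = \phi(0) = \langle\delta,\phi\rangle.$$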

Note: you can set $\epsilon =\frac{1}{n}$, so that $f_n(x) = \frac{1}{\pi}\frac{\epsilon}{x^{2}+\epsilon^{2}}$, and consider the limit as $\epsilon \to 0^{+}$.
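If it helps to see the convergence numerically, here is a small sketch that evaluates the pairing $\langle \frac{1}{\pi}\frac{\epsilon}{x^2+\epsilon^2}, \phi\rangle$ for shrinking $\epsilon$. The test function $\phi(x)=e^{-x^2}$ is my own choice for illustration (it is smooth and rapidly decaying rather than compactly supported, so it is not literally in $D(\mathbb{R})$, but the behavior is the same), and the pairing should tend to $\phi(0)=1$:

```python
import numpy as np
from scipy.integrate import quad

def pairing(eps, phi):
    """<f_eps, phi> with f_eps(x) = (1/pi) * eps / (x**2 + eps**2)."""
    integrand = lambda x: (eps / np.pi) * phi(x) / (x**2 + eps**2)
    # A finite window suffices since the integrand decays fast;
    # points=[0.0] tells quad where the sharp peak sits.
    val, _ = quad(integrand, -50.0, 50.0, points=[0.0], limit=200)
    return val

# Illustrative test function (not compactly supported): phi(0) = 1.
phi = lambda x: np.exp(-x**2)

for eps in (0.1, 0.01, 0.001):
    print(f"eps = {eps:6}: <f_eps, phi> = {pairing(eps, phi):.5f}")
```

The printed values should climb toward $1 = \phi(0)$ as $\epsilon$ shrinks, matching the limit above.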