Introduction to the theory of distributions, Friedlander, Exercise 1.7


Here is the question:

[Image: statement of Exercise 1.7]

(Sorry, I don't know how to type LaTeX, so I've included the question as an image.)

I have proved the first part, but I am stuck on the second. The question seems to ask me to prove that (b) holds for every $\epsilon$ between 0 and 1.

It is clear that the integral in (b) tends to 1 as $\epsilon \to 0$, but I don't know how to prove that (b) holds for all $\epsilon$ between 0 and 1.

Anyone have an idea?

Thanks a lot.

1 Answer

That implication is obvious nonsense.

Say $(f_\epsilon)$ satisfies (a) and $f_\epsilon\to\delta$. Define $$g_\epsilon=\begin{cases}2f_\epsilon,&\epsilon=1/2,\\f_\epsilon,&\epsilon\ne1/2.\end{cases}$$ Then $(g_\epsilon)$ also satisfies (a) and $g_\epsilon\to\delta$, since nothing has changed as $\epsilon\to0$. But $(f_\epsilon)$ and $(g_\epsilon)$ cannot both satisfy (b), because (b) prescribes the value of an integral at each fixed $\epsilon$, and $g_{1/2}=2f_{1/2}$.
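A minimal numerical sketch of this counterexample, assuming a standard Gaussian approximate identity for $f_\epsilon$ (the exercise's actual family is not shown here): doubling the single member at $\epsilon=1/2$ breaks any pointwise-in-$\epsilon$ normalization condition, yet leaves the distributional limit $g_\epsilon\to\delta$ untouched.

```python
import numpy as np

def f(eps, x):
    # Gaussian approximate identity: integrates to 1 over R for every eps
    return np.exp(-(x / eps) ** 2 / 2) / (eps * np.sqrt(2 * np.pi))

def g(eps, x):
    # the modified family from the answer: doubled only at eps = 1/2
    return (2.0 if eps == 0.5 else 1.0) * f(eps, x)

x = np.linspace(-10, 10, 200001)
dx = x[1] - x[0]

# g_{1/2} integrates to 2, so it fails a condition that fixes the
# integral's value at every eps ...
print(np.sum(g(0.5, x)) * dx)   # ≈ 2

# ... yet the distributional limit is unchanged:
# <g_eps, phi> -> phi(0) = 1 as eps -> 0, for a smooth test function phi
phi = np.cos(x)
for eps in [0.5, 0.1, 0.01]:
    print(eps, np.sum(g(eps, x) * phi) * dx)
```

Only the member at the isolated value $\epsilon=1/2$ was altered, so the pairing $\langle g_\epsilon,\varphi\rangle$ for small $\epsilon$ is identical to $\langle f_\epsilon,\varphi\rangle$ and still converges to $\varphi(0)$.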