$\lim\limits_{t\rightarrow\infty}\int\limits_{E}\phi(x+t)dx=0$


Suppose $E\subset\mathbb{R}$ has finite Lebesgue measure and $\varphi\in L^1(\mathbb{R})$. Show that $\lim\limits_{t\rightarrow\infty}\int\limits_{E}\varphi(x+t)dx=0$.

My first thought is to show $\lim\limits_{t\rightarrow\infty}\varphi(x+t)=0$ for a.e. $x\in\mathbb{R}$ and then use the Lebesgue Dominated Convergence Theorem, but I am not sure this is the right approach.


BEST ANSWER

Given $\epsilon>0$, choose a $\psi\in C_{00}$ (continuous with compact support) such that $\|\varphi-\psi\|_{L^{1}(\mathbb{R})}<\epsilon$. Then \begin{align*} \int_{E}|\varphi(x+t)|dx&\leq\int_{E}|\varphi(x+t)-\psi(x+t)|dx+\int_{E}|\psi(x+t)|dx\\ &\leq\|\varphi-\psi\|_{L^{1}(\mathbb{R})}+\int_{E}|\psi(x+t)|dx. \end{align*} Now $|\psi(x+t)|\chi_{E}(x)\to 0$ pointwise as $t\rightarrow\infty$ (for fixed $x$, the point $x+t$ eventually leaves the support of $\psi$), and $|\psi(x+t)|\chi_{E}\leq\|\psi\|_{L^{\infty}}\chi_{E}\in L^{1}(\mathbb{R})$ because $m(E)<\infty$, so the Lebesgue Dominated Convergence Theorem makes the term $\displaystyle\int_{E}|\psi(x+t)|dx$ arbitrarily small as $t\rightarrow\infty$.
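To make the dominated convergence step fully explicit, here is the pointwise limit spelled out (a short derivation; I write $\operatorname{supp}\psi\subset[-R,R]$ for some $R>0$, which exists since $\psi$ has compact support):

```latex
% Pointwise vanishing: for each fixed x, once t is large enough the
% translated point x + t lies outside the support of psi:
\[
  t > R - x \;\Longrightarrow\; x + t > R \;\Longrightarrow\; \psi(x+t) = 0 .
\]
% Hence |psi(x+t)| \chi_E(x) -> 0 pointwise, dominated by the integrable
% function ||psi||_{L^\infty} \chi_E  (integrable because m(E) < \infty), so
\[
  \lim_{t\to\infty} \int_E |\psi(x+t)|\,dx
  = \int_E \lim_{t\to\infty} |\psi(x+t)|\,dx = 0 .
\]
```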

ANOTHER ANSWER

Note that $\int \phi(x+t)1_E(x)\, d x = \int \phi(x)1_E(x-t)\, d x$ (substitute $x \mapsto x - t$; Lebesgue measure is translation invariant).

If $E$ is a bounded set, then for each fixed $x$ we have $\lim_{t \to \infty} 1_E(x-t) = 0$; since the integrand is dominated by $|\phi| \in L^1$, dominated convergence gives $\int \phi(x)1_E(x-t)\, d x \to 0$.

Since $\phi$ is integrable, for any $\epsilon>0$ there is some $\delta>0$ such that $m(A) < \delta$ implies $\int_A|\phi| < \epsilon$ (absolute continuity of the integral).
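For completeness, the absolute-continuity property can itself be derived by dominated convergence (a standard sketch; $A_n$ denotes the set where $|\phi|$ exceeds $n$):

```latex
% Let A_n = { |phi| > n }. Since phi is finite a.e., |phi| \chi_{A_n} -> 0
% a.e., and |phi| \chi_{A_n} <= |phi| \in L^1, so by dominated convergence
% \int_{A_n} |phi| -> 0. Given eps > 0, pick n with \int_{A_n} |phi| < eps/2
% and set delta = eps / (2n). Then for any measurable A with m(A) < delta:
\[
  \int_A |\phi|
  = \int_{A \cap A_n} |\phi| + \int_{A \setminus A_n} |\phi|
  < \frac{\epsilon}{2} + n\, m(A)
  < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon .
\]
```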

Suppose $\epsilon>0$. Choose the $\delta>0$ as above.

Since $E$ has finite measure, we can write $E = B \cup C$ with $B$ bounded and $m(C) < \delta$: take $B = E \cap [-N,N]$ for $N$ large enough, which works because $m(E \cap [-N,N]) \uparrow m(E) < \infty$, and let $C = E \setminus B$.

Then $\left|\int \phi(x+t)1_E(x)\, d x\right| \le \int |\phi(x+t)|1_C(x)\, d x + \int |\phi(x)|1_B(x-t)\, d x < \epsilon + \int |\phi(x)|1_B(x-t)\, d x$, where the first term is less than $\epsilon$ because $\int |\phi(x+t)|1_C(x)\,dx = \int_{C+t}|\phi|$ and $m(C+t) = m(C) < \delta$. By the bounded case there is some $T$ such that $t>T$ implies $\int |\phi(x)|1_B(x-t)\, d x < \epsilon$. Hence $\int \phi(x+t)1_E(x)\, d x \to 0$.
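The whole argument can be summarized in one displayed chain (same notation as above: $E = B \cup C$ with $B$ bounded and $m(C) < \delta$):

```latex
% For t > T, combining absolute continuity with the bounded case:
\[
  \left|\int \phi(x+t)\,1_E(x)\,dx\right|
  \le \underbrace{\int_{C+t} |\phi(y)|\,dy}_{<\,\epsilon,\ \text{since } m(C+t)=m(C)<\delta}
  \;+\;
  \underbrace{\int |\phi(x)|\,1_B(x-t)\,dx}_{<\,\epsilon \ \text{for } t>T \ \text{(bounded case)}}
  \;<\; 2\epsilon .
\]
% As eps > 0 was arbitrary, the integral tends to 0.
```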