The limit at infinity of a function in $L^2$


Suppose that $f'$ is locally absolutely continuous and $f, f'' \in L^{2}(0,\infty)$.

Prove that $$\lim_{x\rightarrow\infty}[f^{2}(x)+2f(x)f'(x)+f'^{2}(x)]=0.$$

Actually, I proved that the limit exists but could not prove that it should be zero.



On BEST ANSWER

Hint: using the square-integrability of $f$ and $f'$ (the latter follows from $f, f'' \in L^{2}(0,\infty)$ by a standard interpolation inequality) and the integral

$$ \int_0^x \frac{d}{dt}(f(t)+f'(t))^2dt $$

show that the limit is finite. Then argue that if the limit is not 0, then at least one of $f$ and $f'$ cannot be in $L^2$.
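A sketch of the first step, assuming (as in the hint) that $f+f'$ and $f'+f''$ are both in $L^{2}(0,\infty)$: by the fundamental theorem of calculus for locally absolutely continuous functions,

$$ (f(x)+f'(x))^2-(f(0)+f'(0))^2=\int_0^x \frac{d}{dt}(f(t)+f'(t))^2dt=2\int_0^x (f(t)+f'(t))(f'(t)+f''(t))\,dt. $$

By Cauchy–Schwarz, the integrand on the right is in $L^1(0,\infty)$, so the integral has a finite limit as $x\to\infty$, and hence so does $(f(x)+f'(x))^2$.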

Edit: The function $g=f^2+2ff'+f'^2=(f+f')^2\geq 0$ is in $L^1(0,\infty)$ because $f$ and $f'$ are in $L^2(0,\infty)$ (so $f^2$, $f'^2$, and, by Cauchy–Schwarz, $ff'$ are all integrable). Now, $g$ being integrable does not imply that the limit of $g$ at infinity is zero, since the limit may not exist; it does, however, imply that the $\liminf$ at infinity is zero. In particular, if you know that the limit exists, then it must be zero (since it must equal the $\liminf$). To show this, argue by contradiction: if the limit is, say, $c>0$, then, for $x$ large enough, say $x>M$, $g(x)>c/2>0$. But then $\int_M^\infty g(x)\,dx=\infty$, contradicting $g\in L^1(0,\infty)$.
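A quick numerical illustration (not part of the proof): with the hypothetical example $f(x)=e^{-x}\sin x$, which has $f, f'' \in L^2(0,\infty)$, one gets $g=(f+f')^2=e^{-2x}\cos^2 x$, which both vanishes at infinity and is integrable.

```python
import numpy as np

# Hypothetical example: f(x) = exp(-x) sin(x), so f' = exp(-x)(cos x - sin x)
# and g = f^2 + 2 f f' + f'^2 = (f + f')^2 = exp(-2x) cos(x)^2.
x = np.linspace(0.0, 40.0, 400_001)
f = np.exp(-x) * np.sin(x)
fp = np.exp(-x) * (np.cos(x) - np.sin(x))  # f'
g = (f + fp) ** 2

# g tends to 0 at infinity: its values past x = 30 are already tiny.
tail = g[x > 30].max()

# g is integrable: trapezoid rule on [0, 40]; the exact value of
# the integral of exp(-2x) cos(x)^2 over (0, inf) is 3/8.
integral = np.sum(0.5 * (g[:-1] + g[1:]) * np.diff(x))

print(tail, integral)
```

This only checks one concrete function, of course; the contradiction argument above is what rules out a nonzero limit in general.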