Asymptotics of a Sobolev function on unbounded interval

Suppose $f : (0,\infty) \to \mathbb{R}$ is locally $H^1$ and $\int_0^\infty \bigl(|f'(t)|^2 + |f(t)|^2\bigr) e^{-t}\,dt$ is finite. Does it then follow that $\lim_{t \to \infty} e^{-t} |f(t)|^2 = 0$?

There are 2 solutions below.
Yes, it is. To show this, denote $$ M=\int_0^\infty \bigl(|f'(t)|^2 + |f(t)|^2\bigr) e^{-t}\,dt\tag{1} $$ and observe that the function $g(t)=f(t)e^{-t/2}$ satisfies the inequality $$ \|g\|^2_{L^2(0,\infty)}=\int_0^{\infty}|g(t)|^2\,dt=\int_0^{\infty}|f(t)|^2e^{-t}\,dt \leqslant M, \tag{2} $$ while $|g'(t)|^2= \bigl(f'(t)e^{-t/2}-g(t)/2\bigr)^2\leqslant 2|f'(t)|^2e^{-t}+|g(t)|^2/2$. Since $(1)$ and $(2)$ give $\int_0^\infty|f'(t)|^2e^{-t}\,dt=M-\|g\|^2_{L^2(0,\infty)}$, it follows that $$ \|g\|^2_{H^1(0,\infty)}=\|g\|^2_{L^2(0,\infty)}+\|g'\|^2_{L^2(0,\infty)} \leqslant 2M-\tfrac12\|g\|^2_{L^2(0,\infty)}\leqslant 2M. $$ The Sobolev space $H^1(0,\infty)$ is known to consist of (uniformly) continuous functions on $[0,\infty)$ that vanish at infinity, which immediately implies $$ \lim_{t\to\infty}e^{-t/2}f(t)=\lim_{t\to\infty}g(t)=0. $$
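As a quick numerical sanity check of the bound $\|g\|^2_{H^1(0,\infty)}\leqslant 2M$ and of the resulting decay, here is a minimal sketch; the sample $f(t)=e^{t/4}\sin t$ and the grid are arbitrary choices, not part of the argument:

```python
import numpy as np

# Numerical sanity check (not part of the proof). Sample choice:
# f(t) = exp(t/4) * sin(t), which is unbounded yet has finite M.
trap = lambda y, x: float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2)

t = np.linspace(1e-6, 80.0, 1_000_000)
f = np.exp(t / 4) * np.sin(t)
fp = np.exp(t / 4) * (np.sin(t) / 4 + np.cos(t))   # f'(t)

M = trap((fp**2 + f**2) * np.exp(-t), t)           # the quantity (1)

g = f * np.exp(-t / 2)                             # g = f e^{-t/2}
gp = fp * np.exp(-t / 2) - g / 2                   # g' = f' e^{-t/2} - g/2

H1 = trap(g**2 + gp**2, t)                         # ||g||_{H^1(0,infty)}^2
print(f"M = {M:.4f}, ||g||_H1^2 = {H1:.4f}, 2M = {2*M:.4f}")
assert H1 <= 2 * M                                 # the bound from the answer
print("tail of e^{-t/2}|f(t)|:", abs(g[-1]))       # -> 0 as t -> infinity
```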
Remark. It is not difficult to verify that all functions $g\in H^1(0,\infty)$ vanish at infinity. Indeed, denote by $\widetilde{g}$ the even extension of $g$ from $(0,\infty)$ to $\mathbb{R}$, and notice that $\widetilde{g}\in H^1(\mathbb{R})$ with $\|\widetilde{g}\|^2_{H^1(\mathbb{R})}=2\|g\|^2_{H^1(0,\infty)}$. By Plancherel's theorem and the Cauchy–Schwarz inequality, the Fourier transform $\widehat{g}=F[\widetilde{g}]$ satisfies the inequality $$ \|\widehat{g}\|^2_{L^1{(\mathbb{R})}}\leqslant \int_{\mathbb{R}}\bigl(1+{\xi}^2\bigr)|\widehat{g}(\xi)|^2\,d\xi \cdot\!\!\int_{\mathbb{R}}\frac{d\xi}{1+{\xi}^2} =2{\pi}^2\|\widetilde{g}\|^2_{H^1(\mathbb{R})}= 4{\pi}^2\|g\|^2_{H^1(0,\infty)}\,, $$ i.e., $\|\widehat{g}\|_{L^1{(\mathbb{R})}}\leqslant 2\pi\|g\|_{H^1(0,\infty)}\leqslant 2\pi\sqrt{2M}<\infty$ by the bound above. Finally, Fourier inversion together with the Riemann–Lebesgue lemma (http://en.wikipedia.org/wiki/Riemann%E2%80%93Lebesgue_lemma) implies that $$ \lim_{t\to\infty}g(t)=\lim_{t\to\infty}\widetilde{g}(t)=0. $$
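Incidentally, the Cauchy–Schwarz step above is sharp: equality holds exactly when $|\widehat{g}(\xi)|$ is proportional to $(1+\xi^2)^{-1}$, i.e. for $g(t)=e^{-t}$, whose even extension $e^{-|t|}$ has $\widehat{g}(\xi)=2/(1+\xi^2)$ in the convention used above. A rough numerical check of this case (the grids and truncations below are ad hoc choices):

```python
import numpy as np

# Check ||ghat||_{L^1} <= 2*pi*||g||_{H^1(0,infty)} for g(t) = exp(-t),
# whose even extension exp(-|t|) has ghat(xi) = 2/(1 + xi^2); here the
# Cauchy-Schwarz step is an equality, so both sides should be ~ 2*pi.
trap = lambda y, x: float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2)

t = np.linspace(0.0, 30.0, 30_001)
g, gp = np.exp(-t), -np.exp(-t)
H1 = trap(g**2 + gp**2, t)                  # ||g||_{H^1(0,infty)}^2 = 1 exactly

# ghat(xi) = int g~(t) e^{-i xi t} dt = 2 * int_0^infty g(t) cos(xi t) dt
xi = np.linspace(-80.0, 80.0, 2_001)
ghat = np.array([2 * trap(g * np.cos(x * t), t) for x in xi])

L1 = trap(np.abs(ghat), xi)                 # ~ 2*pi, minus the truncated tails
print(f"||ghat||_L1 ~ {L1:.4f},  2*pi*||g||_H1 = {2*np.pi*np.sqrt(H1):.4f}")
```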
Yes. Suppose not: then there is $c>0$ such that $|f(t_k)|>ce^{t_k/2}$ for some sequence $t_k\to \infty$. Passing to a subsequence, we may assume that the intervals $I_k=[t_k-1,t_k+1]$ are disjoint. For each $k$, one of the following holds:

- $|f(t)|>\tfrac{c}{2}e^{t_k/2}$ for all $t\in I_k$; then, since $e^{-t}\geqslant e^{-(t_k+1)}$ on $I_k$, $$ \int_{I_k} e^{-t}|f(t)|^2\,dt \geqslant 2e^{-(t_k+1)}\cdot\frac{c^2}{4}e^{t_k}=\frac{c^2}{2e}. $$
- $|f(s)|\leqslant\tfrac{c}{2}e^{t_k/2}$ for some $s\in I_k$; then $|f(t_k)-f(s)|>\tfrac{c}{2}e^{t_k/2}$, and by the Cauchy–Schwarz inequality $$ \frac{c^2}{4}e^{t_k}<|f(t_k)-f(s)|^2=\Bigl|\int_s^{t_k}f'(t)\,dt\Bigr|^2\leqslant |t_k-s|\int_{I_k}|f'(t)|^2\,dt\leqslant \int_{I_k}|f'(t)|^2\,dt, $$ so that $\int_{I_k}e^{-t}|f'(t)|^2\,dt\geqslant e^{-(t_k+1)}\cdot\frac{c^2}{4}e^{t_k}=\frac{c^2}{4e}$.

Thus, $\int_{I_k} e^{-t}\bigl(|f(t)|^2+|f'(t)|^2\bigr)\,dt\geqslant \frac{c^2}{4e}$ for every $k$. Since the intervals $I_k$ are disjoint, summing over $k$ shows that $\int_0^\infty e^{-t}\bigl(|f(t)|^2+|f'(t)|^2\bigr)\,dt$ diverges, contradicting the hypothesis. Hence $e^{-t}|f(t)|^2\to 0$.
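To see the contradiction concretely, take $f(t)=e^{t/2}$ (an illustrative choice violating the conclusion, since $e^{-t}|f(t)|^2\equiv 1$): every interval $I_k$ then carries the same fixed mass $5/2$, so the integral over $(0,\infty)$ cannot be finite. A minimal sketch:

```python
import numpy as np

# Illustration: f(t) = exp(t/2) has e^{-t}|f|^2 = 1 (no decay), and every
# I_k = [t_k - 1, t_k + 1] carries the same mass 5/2, so summing over
# disjoint intervals makes the full integral diverge.
trap = lambda y, x: float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2)

for tk in [2.0, 5.0, 9.0, 14.0, 20.0]:        # arbitrary points, disjoint I_k
    s = np.linspace(tk - 1, tk + 1, 10_001)
    f, fp = np.exp(s / 2), np.exp(s / 2) / 2
    mass = trap(np.exp(-s) * (f**2 + fp**2), s)
    print(f"t_k = {tk:4.1f}:  mass over I_k = {mass:.4f}")   # always 5/2
```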