If $u\in L^1(\mathbb{R})$, how can one prove $\lim\limits_{t \to 0^+} \left\lVert u(xe^t)-u(x) \right\rVert_{L^{1}(\mathbb{R})}=0$?


I want to prove $\lim\limits_{t \to 0^+} \left\lVert u(xe^t)-u(x) \right\rVert_{L^{1}(\mathbb{R})}=0$, where $u\in L^1(\mathbb{R})$ and $t \in [0,+\infty)$, but I have run into a problem in my proof.

Since $u\in L^1(\mathbb{R})$, the function $u$ is defined and finite outside some set $A\subset\mathbb{R}$ of measure zero. So I want to find a function $g\in L^{1}(\mathbb{R})$ such that $|u(xe^t)|\le g(x)$ on $\mathbb{R}\setminus A$ for all $t \in [0,1]$; then I could use the dominated convergence theorem to finish the proof.

I have tried my best to construct such a function $g$, but failed. Can anyone help me?

Actually, my goal is to prove that $S$ is a $C_0$-semigroup, where $$S:\mathbb{R}_+ \to \mathbf{L}\bigl(L^1(\mathbb{R})\bigr), \qquad S(t)u(x)=u(xe^t).$$ If anyone has another way to prove it, please tell me.
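For what it's worth, the algebraic semigroup laws are immediate from the definition; only the strong continuity asked about above is delicate. Indeed, $S(0)$ is the identity, and for $s,t\ge 0$,
$$\bigl(S(t)S(s)u\bigr)(x)=\bigl(S(s)u\bigr)(xe^t)=u\bigl(xe^{t}e^{s}\bigr)=\bigl(S(t+s)u\bigr)(x),$$
while the substitution $y=xe^t$ (so $dy=e^t\,dx$) shows each $S(t)$ is bounded on $L^1(\mathbb{R})$:
$$\lVert S(t)u\rVert_{L^1}=\int_{\mathbb{R}}|u(xe^t)|\,dx=e^{-t}\int_{\mathbb{R}}|u(y)|\,dy=e^{-t}\lVert u\rVert_{L^1}\le\lVert u\rVert_{L^1}.$$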


On BEST ANSWER

I don't think dominated convergence will take you all the way. Some hints: Prove it first for continuous $g$ with compact support. Here DCT will be useful. Then use the fact that such $g$'s are dense in $L^1$ and end with a standard approximation argument.
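To spell out the suggested approximation step (a sketch): given $\varepsilon>0$, choose a continuous, compactly supported $g$ with $\lVert u-g\rVert_{L^1}<\varepsilon$ and split
$$\lVert u(xe^t)-u(x)\rVert_{L^1}\le\lVert u(xe^t)-g(xe^t)\rVert_{L^1}+\lVert g(xe^t)-g(x)\rVert_{L^1}+\lVert g-u\rVert_{L^1}.$$
The substitution $y=xe^t$ gives $\lVert u(xe^t)-g(xe^t)\rVert_{L^1}=e^{-t}\lVert u-g\rVert_{L^1}<\varepsilon$ for $t\ge 0$, and the last term is $<\varepsilon$ by the choice of $g$. For the middle term, if $\operatorname{supp} g\subset[-M,M]$, then for $t\ge 0$ the integrand is supported in $[-M,M]$ and bounded by $2\max|g|$, so it tends to $0$ as $t\to 0^+$ by uniform continuity of $g$ (or by DCT).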

Note: $e^t$ is kind of a red herring. Just replace it with $s$ as $s\to 1^+.$
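Not a proof, but the dilation form with $s\to 1^+$ is easy to sanity-check numerically. A minimal sketch using the sample function $u(x)=e^{-|x|}$ (my choice of test function, not from the question), for which the norm even has the closed form $2(1-1/s)$ when $s\ge 1$:

```python
import numpy as np

def l1_dilation_gap(s, half_width=40.0, n=400_001):
    """Riemann-sum estimate of ||u(s x) - u(x)||_{L^1(R)} for u(x) = exp(-|x|)."""
    x = np.linspace(-half_width, half_width, n)  # tail beyond 40 is ~e^{-40}, negligible
    dx = x[1] - x[0]
    u = lambda y: np.exp(-np.abs(y))
    return float(np.sum(np.abs(u(s * x) - u(x))) * dx)

# For s >= 1, e^{-s|x|} <= e^{-|x|}, so the exact value is 2 - 2/s = 2(1 - 1/s),
# which visibly tends to 0 as s -> 1+.
for s in (1.5, 1.1, 1.01, 1.001):
    print(f"s = {s}: numeric ~ {l1_dilation_gap(s):.6f}, exact = {2 * (1 - 1 / s):.6f}")
```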