Use the interpolation theorem to estimate the $L^p$ norm of f(x) when $p>2$.


The maximum of the function $\displaystyle f(x)=\frac{\sin(x)}{x}$ is $1$ and $\displaystyle \int_{-\infty}^\infty(\frac{\sin(x)}{x})^2 dx= \pi$. Use the interpolation theorem to estimate the $L^p$ norm of $f(x)$ when $p>2$.

From another problem I know that $f(x) \in L^p(\mathbb{R})$ for $p>2$, but I am unsure how to apply the interpolation theorem here.

Accepted answer:

The information you are given says that $\|f\|_{L^{\infty}(\mathbb{R})}=1$ and $\|f\|_{L^{2}(\mathbb{R})}=\sqrt{\pi}$. Let $\frac{1}{p_{t}}=\frac{1-t}{2}+\frac{t}{\infty}=\frac{1-t}{2}$ for $t\in(0,1)$. We use the log-convexity of $L^{p}$ norms (sometimes referred to as an interpolation result) to conclude:

$$\|f\|_{L^{p_{t}}(\mathbb{R})}\le\|f\|^{1-t}_{L^{2}(\mathbb{R})}\,\|f\|^{t}_{L^{\infty}(\mathbb{R})}=(\sqrt{\pi})^{1-t}.$$

Since $\frac{1}{p_{t}}=\frac{1-t}{2}$ gives $1-t=\frac{2}{p_{t}}$, the bound can be rewritten purely in terms of $p$: for every $p>2$,

$$\|f\|_{L^{p}(\mathbb{R})}\le(\sqrt{\pi})^{2/p}=\pi^{1/p}.$$
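As a quick numerical sanity check, one can compare the $L^p$ norm of $\sin(x)/x$ against the interpolation bound $\pi^{1/p}$. This is only a sketch: the truncation point `cutoff=200` is an arbitrary choice, justified because the tail of $|\sin(x)/x|^p$ is bounded by $x^{-p}$ and contributes negligibly for $p\ge 3$.

```python
import numpy as np
from scipy.integrate import quad

def lp_norm(p, cutoff=200.0):
    """Numerically approximate the L^p norm of sin(x)/x on the real line."""
    # np.sinc(t) = sin(pi t)/(pi t), so np.sinc(x/pi) = sin(x)/x,
    # which handles the removable singularity at x = 0 cleanly.
    integrand = lambda x: np.abs(np.sinc(x / np.pi)) ** p
    val, _ = quad(integrand, 0.0, cutoff, limit=400)
    # The integrand is even, so double the half-line integral.
    return (2.0 * val) ** (1.0 / p)

for p in (3, 4, 6):
    bound = np.pi ** (1.0 / p)
    print(f"p={p}: ||f||_p ~ {lp_norm(p):.4f} <= pi^(1/p) ~ {bound:.4f}")
```

For $p=4$ the integral is known exactly, $\int_{-\infty}^{\infty}(\sin x/x)^4\,dx=\frac{2\pi}{3}$, so the computed norm should land near $(2\pi/3)^{1/4}\approx 1.203$, comfortably below $\pi^{1/4}\approx 1.331$.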