Proving an inequality related to ratio of norms to measures


Let $\mu$ be a finite measure on a measurable space $(X, \mathcal{F})$ and let $1 \leq r < s < +\infty$. Prove that for every $f \in L^{s}(X)$, it holds that

$$\frac{\|f\|_{L^{r}(X)}}{\mu (X)^{1/r}} \leq \frac{\|f\|_{L^{s}(X)}}{\mu (X)^{1/s}}$$

I am studying for an exam, and this is a previous year's problem. I really have no clue how to approach it since I am quite new to $L^{p}$ spaces. I've been working with the definitions, but many of the theorems are confusing to me. I would greatly appreciate any help in approaching this problem. By the way, this question comes from a probability exam, where the underlying space is a sample space (often written $\Omega$).

Best answer

This is just Hölder's inequality: $\int |f|^{r} \, d\mu = \int (1)(|f|^{r}) \, d\mu \leq \left(\int 1^{p} \, d\mu\right)^{1/p} \left(\int |f|^{rq} \, d\mu\right)^{1/q}$, where $q = \frac{s}{r}$ and $p = \frac{q}{q-1}$.

This gives $\|f\|_r \leq \mu(X)^{\frac{1}{pr}} \, \|f\|_s$ since $qr = s$. Now $\frac{1}{pr} = \frac{s-r}{sr} = \frac{1}{r} - \frac{1}{s}$. Dividing both sides by $\mu(X)^{1/r}$ leaves $\mu(X)^{\frac{1}{pr} - \frac{1}{r}} = \mu(X)^{-1/s}$ on the right, which finishes the proof.
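If it helps to build intuition, the inequality can be sanity-checked numerically on a finite measure space $X = \{0, \dots, n-1\}$ with positive weights $\mu_i$, where $\|f\|_{L^p} = \left(\sum_i \mu_i |f_i|^p\right)^{1/p}$. This is just an illustrative sketch (the function and variable names are mine, not from the problem), not part of the proof:

```python
import random

# Sanity check: ||f||_r / mu(X)^(1/r) <= ||f||_s / mu(X)^(1/s)
# on a finite measure space X = {0, ..., n-1} with weights mu_i.
random.seed(0)

def lp_norm(f, mu, p):
    """Discrete L^p norm: (sum_i mu_i * |f_i|^p)^(1/p)."""
    return sum(m * abs(x) ** p for x, m in zip(f, mu)) ** (1.0 / p)

for _ in range(1000):
    n = random.randint(1, 10)
    mu = [random.uniform(0.1, 5.0) for _ in range(n)]   # finite measure
    f = [random.uniform(-10.0, 10.0) for _ in range(n)]  # arbitrary f
    r = random.uniform(1.0, 4.0)
    s = r + random.uniform(0.1, 4.0)                     # 1 <= r < s
    mu_X = sum(mu)
    lhs = lp_norm(f, mu, r) / mu_X ** (1.0 / r)
    rhs = lp_norm(f, mu, s) / mu_X ** (1.0 / s)
    # Allow a tiny tolerance for floating-point rounding.
    assert lhs <= rhs + 1e-9, (lhs, rhs)

print("inequality held in all random trials")
```

Of course this only probes finitely many cases; the Hölder argument above is what actually proves it.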