Hölder's Inequality in Weak Lebesgue Space


For $1\leq p<\infty$, let $L^{p,\infty}$ denote the weak Lebesgue space, with quasinorm $$\|f\|_{L^{p,\infty}} := \sup_{\gamma>0}\gamma|\{x \in \mathbb{R}^d:|f(x)|>\gamma\}|^{\frac{1}{p}}.$$ Suppose $1<p,q,r<\infty$ and $\frac{1}{r} = \frac{1}{p} + \frac{1}{q}$. If $f \in L^{p,\infty}$ and $g \in L^{q,\infty}$, then there is a constant $C>0$ such that $$\|fg\|_{L^{r,\infty}} \leq C \|f\|_{L^{p,\infty}} \|g\|_{L^{q,\infty}}.$$
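As a quick illustration of the definition (my own example, not part of the question): in $d=1$ the function $f(x)=|x|^{-1/p}$ has $|\{|f|>\gamma\}| = |\{|x|<\gamma^{-p}\}| = 2\gamma^{-p}$, so $\gamma\,|\{|f|>\gamma\}|^{1/p} = 2^{1/p}$ for every $\gamma>0$, and $f \in L^{p,\infty}$ even though it fails to be in $L^p$ near the origin. A minimal grid-based check:

```python
import numpy as np

# Sanity check of the weak-L^p quasinorm for f(x) = |x|^{-1/p} in d = 1:
# gamma * |{ |f| > gamma }|^{1/p} should be the constant 2^{1/p}.
p = 2.0
L = 2.0  # for gamma >= 1 the superlevel set {|x| < gamma^{-p}} fits inside [-L, L]
x = np.linspace(-L, L, 4_000_000)  # even point count, so x = 0 (where f blows up) is skipped
dx = x[1] - x[0]
f = np.abs(x) ** (-1.0 / p)

vals = []
for gamma in (1.0, 2.0, 5.0):
    measure = np.count_nonzero(f > gamma) * dx  # grid approximation of the Lebesgue measure
    vals.append(gamma * measure ** (1.0 / p))
print(vals)  # each entry is close to 2**(1/p) ≈ 1.41421
```

The quantity $\gamma|\{|f|>\gamma\}|^{1/p}$ comes out (numerically) the same for each $\gamma$, which is exactly what membership in $L^{p,\infty}$ with this $f$ predicts.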

I only know that we can bound $$|\{x \in \mathbb{R}^d:|fg(x)|>\gamma\}| \leq |\{x \in \mathbb{R}^d:|f(x)|>\gamma\}|+|\{x \in \mathbb{R}^d:|g(x)|>1\}|,$$ but I can't figure out what to do next. Any help would be appreciated.


First, observe that by scaling we may assume $\|f\|_{L^{p,\infty}}=\|g\|_{L^{q,\infty}}=1$. The decomposition we want is along the lines of $$ |\{|fg(x)| > \gamma\}| \leq |\{|f(x)| > c\gamma\}| + |\{|g(x)| > 1/c\}|, $$ which is valid because $|f(x)| \leq c\gamma$ and $|g(x)| \leq 1/c$ together force $|fg(x)| \leq \gamma$; here we are free to choose $c$ to depend on $\gamma$.

To get a reasonable bound it makes sense to have our upper bounds on the two terms in the sum match. The bounds we have are $$ |\{|f|>c\gamma\}| \leq (c\gamma)^{-p} $$ and $$ |\{|g|>1/c\}| \leq c^{q}. $$ To make them match, choose $c = \gamma^{-p/(p+q)}$. This leads to the bound $$ |\{|fg(x)|>\gamma\}| \leq 2\gamma^{-pq/(p+q)} = 2\gamma^{-r}, $$ since $\frac{1}{r} = \frac{1}{p} + \frac{1}{q}$ means $r = \frac{pq}{p+q}$. Taking the supremum over $\gamma>0$ then gives $\|fg\|_{L^{r,\infty}} \leq 2^{1/r}$, as desired.
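The exponent bookkeeping above can be double-checked numerically. The snippet below (a sketch of my own, assuming the normalization $\|f\|_{L^{p,\infty}}=\|g\|_{L^{q,\infty}}=1$) verifies that with $c = \gamma^{-p/(p+q)}$ the two bounds $(c\gamma)^{-p}$ and $c^{q}$ coincide, and that their common value is $\gamma^{-r}$ with $r = pq/(p+q)$:

```python
# Numeric check of the exponent matching, for a few sample values of p, q, gamma.
for p, q in [(2.0, 3.0), (1.5, 4.0)]:
    r = p * q / (p + q)                # equivalent to 1/r = 1/p + 1/q
    for gamma in (0.5, 1.0, 3.0):
        c = gamma ** (-p / (p + q))    # the splitting constant chosen above
        bound_f = (c * gamma) ** (-p)  # bound on |{ |f| > c*gamma }|
        bound_g = c ** q               # bound on |{ |g| > 1/c }|
        assert abs(bound_f - bound_g) < 1e-12 * max(bound_f, 1.0)
        assert abs(bound_f - gamma ** (-r)) < 1e-12 * max(bound_f, 1.0)
print("both bounds equal gamma**(-r) in every tested case")
```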

It is possible to make this argument work without the scaling assumption at the beginning; the choice of the constant $c$ just becomes more complicated. So the only thing you were missing was the idea that you have a bit more freedom in how to split the product.