Characterization of extendible distributions

I found the following question, which characterizes when a locally integrable function on $\mathbb{R}_{>0}$ extends to a distribution on $\mathbb{R}$:

Let $f \in L_{\text{loc}}^{1}(\mathbb{R}_{>0})$ with $f \geq 0$ a.e. Show that $f$ extends to a distribution $F$ on $\mathbb{R}$ if and only if there exists $k \geq 0$ such that $$\int_{\varepsilon}^{1}f(x)\,dx=O(\varepsilon^{-k}) \quad \text{as } \varepsilon \rightarrow 0^{+}.$$
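As a quick sanity check (my own example, not part of the original problem): for $f(x)=x^{-a}$ with $a>1$,
$$\int_{\varepsilon}^{1} x^{-a}\,dx = \frac{\varepsilon^{-(a-1)}-1}{a-1} = O\!\left(\varepsilon^{-(a-1)}\right),$$
so the condition holds with $k=a-1$, and this $f$ indeed extends to $\mathbb{R}$ (e.g. as a Hadamard finite-part distribution). On the other hand, $f(x)=e^{1/x}$ satisfies $\int_{\varepsilon}^{1} e^{1/x}\,dx \geq \varepsilon\, e^{1/(2\varepsilon)}$ for $\varepsilon \leq 1/2$, which beats every power $\varepsilon^{-k}$, and it is the standard example of a function with no distributional extension.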

I would like to know if someone could give me a hint for this problem. The only idea I had was to evaluate the distribution $T_f$ generated by $f$ on test functions $\varphi_\varepsilon$ with $\operatorname{supp}(\varphi_\varepsilon) \subset (\varepsilon,1)$, but without success.
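To make the kind of estimate I was hoping for concrete (the cutoffs and constants here are my own choice, and I could not close the argument): if an extension $F$ exists, it has some finite order $m$ on a neighborhood of $[0,1]$, so one can take $0 \leq \varphi_\varepsilon \leq 1$ with $\varphi_\varepsilon = 1$ on $[2\varepsilon, 1/2]$, $\operatorname{supp}(\varphi_\varepsilon) \subset (\varepsilon, 1)$, and $\|\varphi_\varepsilon^{(j)}\|_\infty \leq C_j\,\varepsilon^{-j}$ (by scaling a fixed bump at the left edge). Then, since $f \geq 0$,
$$\int_{2\varepsilon}^{1/2} f(x)\,dx \;\leq\; \int_0^\infty f\,\varphi_\varepsilon\,dx \;=\; \langle F, \varphi_\varepsilon\rangle \;\leq\; C \sum_{j \leq m} \|\varphi_\varepsilon^{(j)}\|_\infty \;=\; O(\varepsilon^{-m}),$$
which looks like the necessity direction with $k=m$, but I do not see how to prove the converse.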

Another question I have is:

Is there any generalization of this result to higher dimensions?