It's written in an article that:
If $f\in L^p$, $p>2$, then $f=f_1+f_2$, where $f_1\in L^2$, $f_2\in L^\infty$, and $||f_1||_{L^2}\leq 2||f||_{L^p}$, $||f_2||_{L^\infty}\leq 2||f||_{L^p}$.
But I don't know how to prove it. Could I ask for some directions for proving it, or, if it is a standard result, a reference for this theorem? Thanks a lot.
Actually you don't even need the factor of $2$ there.
Indeed, take $f_1=f \cdot 1_{\{x:\;|f(x)|> \| f\|_p\}}$ and $f_2=f \cdot 1_{\{x:\;|f(x)| \leq \|f\|_p\}}$.
Clearly, $\|f_2\|_{\infty} \leq \|f\|_p$ by definition.
On the other hand, suppose $a,b$ are positive real numbers with $a>b$. Then $a/b>1$, so (since $p>2$) we have $(a/b)^p>(a/b)^2$, which rearranges to $a^2 < b^{2-p}a^p$. Thus if $|f(x)|>\|f\|_p$, applying this inequality with $a=|f(x)|$ and $b=\|f\|_p$ gives $|f(x)|^2 < \|f\|_p^{2-p}\,|f(x)|^p$. Hence $$\|f_1\|_2^2 = \int |f_1|^2 \; d\mu \leq \int \|f\|_p^{2-p}\,|f|^p \; d\mu = \|f\|_p^{2-p}\,\|f\|_p^p = \|f\|_p^2,$$ which implies that $\|f_1\|_2 \leq \|f\|_p$.
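As a sanity check, here is a small numerical sketch of this splitting on a finite measure space, assuming the uniform probability measure on $n$ points (so $\int g\,d\mu = \frac1n\sum_i g_i$). The sample values and $p=3$ are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 3.0
n = 1000
f = rng.standard_normal(n)          # an arbitrary "function" on n points
w = np.full(n, 1.0 / n)             # uniform probability measure

# ||f||_p with respect to the measure w
norm_p = (w * np.abs(f) ** p).sum() ** (1 / p)

# Split along the level set {|f| > ||f||_p}, as in the answer:
big = np.abs(f) > norm_p
f1 = np.where(big, f, 0.0)          # the tall, L^2 part
f2 = np.where(big, 0.0, f)          # the bounded, L^infty part

norm2_f1 = np.sqrt((w * f1 ** 2).sum())   # ||f_1||_2
norminf_f2 = np.abs(f2).max()             # ess sup = max here

# The decomposition and both norm bounds hold (with no factor of 2):
print(np.allclose(f1 + f2, f), norm2_f1 <= norm_p, norminf_f2 <= norm_p)
```

Note that the ambient measure matters: with counting measure (all atoms of mass $1$) one always has $\|f\|_\infty \le \|f\|_p$, so the set $\{|f|>\|f\|_p\}$ is empty and $f_1=0$ trivially; a probability measure makes the split nontrivial.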