I have been struggling with this problem; it should only need some basic inequalities, but I am having difficulty getting them in the right order.
Let $f \in L^2(\mathbb{R})$ be such that $g(x) = xf(x)$ is also in $L^2(\mathbb{R})$.
Show that $\|f\|_1 \leq \sqrt{2}(\|f\|_2+\|g\|_2)$. It certainly looks like Minkowski, but I can't figure out the right first step. If anybody could even get me on the right track, I'd be really appreciative!
You need to split the argument into two parts:
On $A=[-1,1]$, you have $\|f\|_{1,A}^2 \leq 2\|f\|_{2,A}^2 \leq 2\|f\|_2^2$ (Jensen's inequality).
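Spelled out, the same bound follows from Cauchy–Schwarz against the constant function $1$ on $A$ (equivalent to the Jensen step, since $|A| = 2$):
$$\|f\|_{1,A} = \int_{-1}^{1} |f(x)|\cdot 1\,dx \leq \left(\int_{-1}^{1} |f(x)|^2\,dx\right)^{1/2}\left(\int_{-1}^{1} 1\,dx\right)^{1/2} = \sqrt{2}\,\|f\|_{2,A} \leq \sqrt{2}\,\|f\|_2.$$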
On $A^c=[-1,1]^c$, you have
$$\|f\|_{1,A^c} = \left\| \frac{g}{x} \right\|_{1,A^c} \leq \|g\|_{2,A^c}\left\|\frac{1}{x}\right\|_{2,A^c} \leq \sqrt{2}\, \|g\|_2.$$
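Here the first inequality is Cauchy–Schwarz on $A^c$, and the constant $\sqrt{2}$ comes from the explicit computation
$$\left\|\frac{1}{x}\right\|_{2,A^c}^2 = \int_{|x|>1} \frac{dx}{x^2} = 2\int_1^\infty \frac{dx}{x^2} = 2, \qquad \text{so} \qquad \left\|\frac{1}{x}\right\|_{2,A^c} = \sqrt{2}.$$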
Hence, adding the two pieces,
$$\|f\|_1 = \|f\|_{1,A} + \|f\|_{1,A^c} \leq \sqrt{2}\,\|f\|_2 + \sqrt{2}\,\|g\|_2 = \sqrt{2}\,(\|f\|_2+\|g\|_2).$$