Prove that $\frac{\langle f^2,g\rangle_{L^2}}{\left\|f\right\|_{L^2}^2}\ge-\left\|g\right\|_{L^\infty}$ for $f\in L^2$ and $g\in L^\infty$


Let $\Omega\subseteq\mathbb{R}^n$ be bounded, $f\in L^2(\Omega)$ with $f\not\equiv 0$, and $g\in L^\infty(\Omega)$. How can we show that $$\frac{\langle f^2,g\rangle_{L^2(\Omega)}}{\left\|f\right\|_{L^2(\Omega)}^2}\ge-\left\|g\right\|_{L^\infty(\Omega)}\tag{1}$$ holds? I tried to use the Cauchy–Schwarz and Hölder inequalities, but I wasn't able to obtain $(1)$.
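Not a proof, but as a quick numerical sanity check one can test $(1)$ on $\Omega=(0,1)$ with randomly sampled discretized $f$ and $g$, approximating the $L^2$ inner product and norms by Riemann sums (the grid size and the sampling distributions below are arbitrary choices for illustration):

```python
import numpy as np

# Sanity check of inequality (1) on Omega = (0, 1):
# <f^2, g> / ||f||^2 >= -||g||_inf, with integrals
# approximated by Riemann sums on a uniform grid.
rng = np.random.default_rng(0)
n = 10_000
dx = 1.0 / n  # uniform grid spacing on (0, 1)

for _ in range(100):
    f = rng.normal(size=n)                 # sample of f in L^2(0,1), f != 0 a.s.
    g = rng.uniform(-3.0, 3.0, size=n)     # sample of g in L^inf(0,1)

    inner = np.sum(f**2 * g) * dx          # <f^2, g>_{L^2}
    norm_f_sq = np.sum(f**2) * dx          # ||f||_{L^2}^2
    g_sup = np.max(np.abs(g))              # ||g||_{L^inf} (essential sup on the grid)

    assert inner / norm_f_sq >= -g_sup     # inequality (1)

print("inequality (1) held in all random trials")
```

Of course this only checks finitely many samples; it just suggests the bound is tight exactly when $g$ is constantly $-\left\|g\right\|_{L^\infty}$ on the support of $f$.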