Consider two functions $g,h$ with $g\in L^m(\Omega)$ and $h\in L^{l+\epsilon}(\Omega)$, where $\Omega\subset\mathbb{R}^l$ ($l\geq 2$) is a bounded domain, $\epsilon>0$ is small (for simplicity we may assume $\epsilon<1$) and $m\gg 1$ ($m$ can be chosen as large as we like, as long as it is finite). With this information at hand, I need to bound the integral term $$\int_{\Omega}|g|^{m-2}|h|^2\, dx$$ in terms of $|h|^{l+\epsilon}$ and $|g|^{m}$, modulo some constants that are not important in this context. That is, I hope for an inequality of the following form: $$\int_{\Omega}|g|^{m-2}|h|^2\, dx \leq C\int_{\Omega}\left(|g|^m + |h|^{l+\epsilon}+1\right) dx.$$
I assume Young's inequality may be applicable with a clever choice of exponents, but I have not managed to make it work so far. I also thought about some interpolation theorems, but had no success with those either.
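To illustrate where the naive attempt breaks down: the obvious choice of conjugate exponents for the product $|g|^{m-2}|h|^2$ is $p=\frac{m}{m-2}$ and $q=\frac{m}{2}$ (so that $\frac1p+\frac1q=1$), and Young's inequality then gives
$$|g|^{m-2}|h|^2 \;\leq\; \frac{m-2}{m}\,|g|^{m} + \frac{2}{m}\,|h|^{m},$$
which produces the term $|h|^{m}$ on the right-hand side instead of the desired $|h|^{l+\epsilon}$ (and this is worse, not better, since $m$ is large). So a more clever splitting, if one exists, must somehow trade the exponent on $|h|$ down to $l+\epsilon$.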
Maybe somebody has an idea of how to derive an estimate of the preceding form, or (in case such an estimate cannot hold true) a reason why it fails in general. Thanks in advance for any help provided!
Edit: I forgot to mention that I also tried applying Sobolev's inequality, but that did not yield the desired estimate either.