This question is from one of the steps of the Proof of Proposition B.1 in Appendix B of P. H. Rabinowitz's "Minimax Methods in Critical Point Theory with Applications to Differential Equations."
Let $u \in L^r(\Omega)$, $r,s \ge 1$, and $a_1,a_2 \ge 0$. How does one prove the following inequality, for some constant $a_3$ depending only on $a_1$, $a_2$, and $s$? $$\int_{\Omega} \big(a_1+a_2|u|^{r/s}\big)^s\,dx \leq a_3 \int_{\Omega} \big(1+|u|^r\big)\,dx$$
We will use the following inequality: $$\forall a,b\ge 0,\ s\ge 1:\quad (a+b)^s\leq 2^{s-1}(a^s+b^s).$$ This follows from the convexity of $f(t)=t^s$ on $t\ge 0$ (this is where $s\ge 1$ is needed): $$f\Big(\tfrac{1}{2}a+\tfrac{1}{2}b\Big)\leq \tfrac{1}{2}f(a)+\tfrac{1}{2}f(b)\ \Longleftrightarrow\ \Big(\frac{a+b}{2}\Big)^s\leq \frac{a^s+b^s}{2}\ \Longleftrightarrow\ (a+b)^s\leq 2^{s-1}(a^s+b^s).$$ Now bound the integrand pointwise. For every $x\in\Omega$ we have $$a_1+a_2|u(x)|^{r/s}\leq \max\{a_1,a_2\}\,\big(1+|u(x)|^{r/s}\big),$$ and applying the inequality above with $a=1$ and $b=|u(x)|^{r/s}$ gives $$\big(1+|u(x)|^{r/s}\big)^s\leq 2^{s-1}\big(1+|u(x)|^r\big).$$ By monotonicity of the integral, $$\int\limits_{\Omega}{\big(a_1+a_2|u|^{r/s}\big)^s\,dx}\leq \max\{a_1,a_2\}^s\int\limits_{\Omega}{\big(1+|u|^{r/s}\big)^s\,dx}\leq \max\{a_1,a_2\}^s\,2^{s-1}\int\limits_{\Omega}{\big(1+|u|^r\big)\,dx}=a_3\int\limits_{\Omega}{\big(1+|u|^r\big)\,dx},$$ where $a_3:=2^{s-1}\max\{a_1,a_2\}^s$.
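As a quick numerical sanity check (not part of the proof), one can verify the pointwise bound $(a_1+a_2 t)^s \le 2^{s-1}\max\{a_1,a_2\}^s(1+t^s)$ for $t=|u(x)|^{r/s}\ge 0$, which is exactly what gets integrated above. The sketch below uses randomly drawn values of $a_1$, $a_2$, $s$, and $t$ (all names and ranges are illustrative choices, not from the original):

```python
import random

# Illustrative check of the pointwise inequality
#   (a1 + a2*t)^s <= 2^(s-1) * max(a1, a2)^s * (1 + t^s),  t >= 0, s >= 1,
# which, integrated over Omega with t = |u(x)|^{r/s}, yields the claim
# with a3 = 2^(s-1) * max(a1, a2)^s.
def bound_holds(a1, a2, s, t):
    lhs = (a1 + a2 * t) ** s
    rhs = 2 ** (s - 1) * max(a1, a2) ** s * (1 + t ** s)
    # tiny relative tolerance to absorb floating-point rounding
    return lhs <= rhs * (1 + 1e-9)

random.seed(0)
for _ in range(10000):
    a1 = random.uniform(0, 10)
    a2 = random.uniform(0, 10)
    s = random.uniform(1, 5)    # the convexity argument requires s >= 1
    t = random.uniform(0, 100)  # stands in for |u(x)|^{r/s}
    assert bound_holds(a1, a2, s, t)
print("all checks passed")
```

Of course this checks only finitely many samples; the convexity argument above is what proves the inequality for all admissible values.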