Let $a,b\in\mathbb R_{\geq 0}$ and let $p>1$. Is it true that $$ (a+b)^p\geq a^p+b^p $$ holds? It's obvious for $p\in\mathbb N_{>1}$. I was thinking of making an argument involving derivatives: for each $p\in\mathbb N_{>1}$ we have that $(a+b)^p-a^p-b^p>0$, and the derivative with respect to $p$ then yields $\ln(p)[(a+b)^p-a^p-b^p]>0$. Maybe I can somehow use this fact? In any case, I'm stuck and would appreciate some help.
Is it true that $(a+b)^p\geq a^p+b^p$ for $p>1$?
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 4 solutions below.
Assume without loss of generality that $0<b\leq a$ and write $a+b=a(1+x)$, where $x=b/a\in(0,1]$. Then Bernoulli's inequality gives $$a^p(1+x)^p>a^p(1+px)>a^p(1+x^p),$$ where the second inequality holds because $px>x\geq x^p$ for $p>1$ and $x\in(0,1]$. Since $a^p(1+x^p)=a^p+b^p$, the result follows immediately.
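As a sanity check (a numerical illustration only, not part of the proof), the chain of inequalities above can be sampled over a few values of $a$, $x=b/a$, and $p$:

```python
# Spot-check the chain a^p (1+x)^p > a^p (1+px) > a^p (1+x^p)
# for p > 1 and 0 < x <= 1 (i.e. x = b/a with b <= a). Illustrative only.
def chain_holds(a, x, p):
    lhs = a**p * (1 + x)**p   # (a+b)^p with b = a*x
    mid = a**p * (1 + p * x)  # Bernoulli lower bound
    rhs = a**p * (1 + x**p)   # a^p + b^p
    return lhs > mid > rhs

checks = [chain_holds(a, x, p)
          for a in (0.5, 1.0, 3.0)
          for x in (0.1, 0.5, 1.0)
          for p in (1.5, 2.0, 7.3)]
print(all(checks))  # every sampled case satisfies the chain
```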
Note that if $a=0$ or $b=0$ the inequality holds with equality, so consider $a,b>0$, where we show it is strict. Put $\sin^2(\theta) = \dfrac{a}{a+b}, \cos^2(\theta) = \dfrac{b}{a+b}$ with $\theta\in(0,\frac{\pi}{2})$; dividing the desired inequality by $(a+b)^p$, it becomes $\sin^{2p}(\theta)+\cos^{2p}(\theta) < 1$. This is true because $0<\sin^2(\theta),\cos^2(\theta)<1$ and $p>1$, so $\sin^{2p}(\theta)+\cos^{2p}(\theta) < \sin^2(\theta)+\cos^2(\theta) = 1$.
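The substituted inequality can likewise be sampled numerically (an illustration under the same assumptions, not a proof):

```python
import math

# Spot-check that sin(t)^(2p) + cos(t)^(2p) < 1 for p > 1 and t in (0, pi/2),
# the normalized form of a^p + b^p < (a+b)^p. Illustrative only.
def power_sum(t, p):
    return math.sin(t)**(2 * p) + math.cos(t)**(2 * p)

samples = [power_sum(t, p)
           for t in (0.1, math.pi / 4, 1.3)
           for p in (1.01, 2, 5)]
print(max(samples) < 1)  # strictly below 1 at every sampled point
```

At the boundary $p=1$ the sum equals $1$ exactly, matching the Pythagorean identity.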
We may restrict to the case that $a\in(0,\infty)$ and $b\in(0,\infty)$ because $(a+b)^{p}=a^{p}+b^{p}$ if $a=0$ or $b=0$.
Dividing both sides by $a^{p}$, the inequality is equivalent to $\left[1+\frac{b}{a}\right]^{p}>1+\left(\frac{b}{a}\right)^{p}$. This suggests investigating the function $f:[0,\infty)\rightarrow\mathbb{R}$, $f(x)=(1+x)^{p}-1-x^{p}$. Note that $f(0)=0$. For $x\in(0,\infty)$, we have $f'(x)=p[(1+x)^{p-1}-x^{p-1}].$ Consider three cases.
Case 1: $p\in(0,1)$. The function $(0,\infty)\rightarrow(0,\infty)$, $y\mapsto y^{p-1}$ is strictly decreasing, so $(1+x)^{p-1}<x^{p-1}$ and hence $f'(x)<0$.
Case 2: $p=1$. In this case, $f'(x)=0$.
Case 3: $p\in(1,\infty)$. The function $(0,\infty)\rightarrow(0,\infty)$, $y\mapsto y^{p-1}$ is strictly increasing, so $(1+x)^{p-1}>x^{p-1}$ and hence $f'(x)>0$.
We conclude that:
If $p\in(0,1)$, $f$ is strictly decreasing on $[0,\infty)$ and hence for any $x\in(0,\infty)$, $f(x)<f(0)=0$.
If $p=1$, $f$ is a constant function and hence $f(x)=f(0)=0$.
If $p\in(1,\infty)$, $f$ is strictly increasing and hence for any $x\in(0,\infty)$, $f(x)>f(0)=0$.
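The three conclusions can be illustrated numerically (a sketch only; the proof above is the actual argument):

```python
# Illustrate the sign of f(x) = (1+x)^p - 1 - x^p for x > 0 in the three cases:
# f < 0 for p in (0,1), f == 0 for p = 1, f > 0 for p > 1.
def f(x, p):
    return (1 + x)**p - 1 - x**p

xs = [0.25, 1.0, 4.0]
case1 = all(f(x, 0.5) < 0 for x in xs)           # p in (0,1): strictly negative
case2 = all(abs(f(x, 1.0)) < 1e-12 for x in xs)  # p = 1: identically zero
case3 = all(f(x, 2.5) > 0 for x in xs)           # p > 1: strictly positive
print(case1 and case2 and case3)
```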
The method using derivatives does not work: $$\ln(p)[(a+b)^p-a^p-b^p]>0\iff (a+b)^p-a^p-b^p>0,$$ since $\ln(p)>0$ when $p>1$, so we are back to the original problem. (In fact, for $a,b>0$ the derivative of $(a+b)^p-a^p-b^p$ with respect to $p$ is $\ln(a+b)(a+b)^p-\ln(a)\,a^p-\ln(b)\,b^p$, but the circularity remains either way.) However, we know that when $a,b>0$, $\left(\frac{a}{a+b}\right)^p<\frac{a}{a+b}$ for any $p>1$, because $0<\frac{a}{a+b}<1$. Thus, $$ \frac{a^p+b^p}{(a+b)^p} = \left(\frac{a}{a+b}\right)^p+\left(\frac{b}{a+b}\right)^p < \frac{a}{a+b}+\frac{b}{a+b}=1\implies a^p+b^p < (a+b)^p.$$
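The final normalization step can be spot-checked as well (a numerical illustration, not part of the argument):

```python
# Spot-check that for a, b > 0 and p > 1 the normalized ratios satisfy
# (a/(a+b))^p + (b/(a+b))^p < 1, hence a^p + b^p < (a+b)^p. Illustrative only.
def ratio_sum(a, b, p):
    s = a + b
    return (a / s)**p + (b / s)**p

cases = [(1.0, 2.0, 1.5), (0.3, 0.3, 2.0), (5.0, 0.01, 3.7)]
print(all(ratio_sum(a, b, p) < 1 for a, b, p in cases))
```

At $p=1$ the sum is exactly $1$, which is the boundary case of the equivalence.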