Prove that for positive integers $m, n$ and two positive numbers $p, q$ satisfying $p+q = 1$ we have $(1 − p^n)^m + (1 − q^m)^n \ge 1$.


Prove that for positive integers $m, n$ and two positive numbers $p, q$ satisfying $p+q = 1$ we have

$$(1 − p^n)^m + (1 − q^m)^n \ge 1.$$

Using the binomial theorem, $(1-p^n)^m = 1-\binom{m}{1}p^n+\binom{m}{2}p^{2n}-\cdots$, and the same expansion applies to the second term, but the two expansions don't cancel each other out.

Is there anything I am doing wrong? (This is a practice problem from a probability class.)


You can rewrite $q$ as $1-p$.

$$(1-p^n)^m+(1-(1-p)^m)^n\ge1$$

Since $m,n\ge1$, start with the boundary cases. For $m=1$ the second term becomes $(1-(1-p))^n = p^n$, so the sum is $(1-p^n)+p^n = 1$; by symmetry the same happens for $n=1$. So equality is attained on the boundary, and the claim is that the minimum of $(1-p^n)^m+(1-(1-p)^m)^n$ over $m,n\ge1$ is exactly this boundary value $1$. Intuitively, even when $p$ is very small its effect is balanced by that of $q$, since $p$ is as far from $0$ as $q$ is from $1$; but this balance still has to be verified for larger $m$ and $n$.
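For completeness, here is one standard way to handle the general case, via a probabilistic argument (a sketch, not part of the original answer): consider an $m \times n$ array of independent coin flips, each heads with probability $p$ and tails with probability $q = 1-p$. Let $A$ be the event that some row (of length $n$) is all heads, and $B$ the event that some column (of length $m$) is all tails. By independence, $P(A^c) = (1-p^n)^m$ and $P(B^c) = (1-q^m)^n$. A row of heads and a column of tails would have to share a cell, which cannot be both heads and tails, so $A$ and $B$ are disjoint and $P(A)+P(B) \le 1$. Therefore

$$(1-p^n)^m + (1-q^m)^n = \bigl(1-P(A)\bigr) + \bigl(1-P(B)\bigr) = 2 - P(A) - P(B) \ge 1.$$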

For $m=n=1$ the left side is

$$(1-p)+(1-(1-p)) = 1-p+p = 1 \ge 1,$$

so the base case holds with equality.
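Not part of the original answer, but a quick numerical sanity check (evidence, not a proof) of the inequality over a grid of $p$ values and small positive integers $m, n$:

```python
# Check (1 - p^n)^m + (1 - q^m)^n >= 1 numerically, with q = 1 - p,
# over p = 0.01, ..., 0.99 and m, n = 1, ..., 7.
def lhs(p, m, n):
    q = 1 - p
    return (1 - p**n)**m + (1 - q**m)**n

worst = min(
    lhs(p / 100, m, n)
    for p in range(1, 100)   # p = 0.01, 0.02, ..., 0.99
    for m in range(1, 8)
    for n in range(1, 8)
)
print(worst)  # minimum observed value; should be 1 up to rounding error
```

The minimum is attained (up to floating-point rounding) on the boundary $m=1$ or $n=1$, where the sum is exactly $1$, consistent with the base-case computation above.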