If $a < b$ and $a > 0$, then $a^2 < b^2.$


Working from the book: Lang, Serge. "Basic Mathematics" (p. 80, exercise 6).

The proof given by the author is:

We have $a^2=aa<ab<bb=b^2$, using $$u < v \land c > 0 \to uc < vc$$

I understand that:

  • If I multiply each side of $a<b$ by $a>0$, I get $$aa<ab$$
  • If I multiply each side of $a<b$ by $b>0$, I get $$ab<bb$$

What I do not understand is:

  • If $b$ were negative, multiplying each side of $a<b$ by $b$ would reverse the inequality, giving $$ab>bb$$

The premises only say $a>0$, but $b$ could be negative, and the result does not hold in that case. Can someone explain what's happening in this proof?


3 Answers

BEST ANSWER

By transitivity of $<$, you can conclude $0 < b$ from the fact that $0 < a$ and $a < b$.
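Spelled out, the chain that the book's one-line proof compresses is (a sketch in the same notation):

```latex
\begin{align*}
0 < a,\ a < b &\implies 0 < b && \text{transitivity of } < \\
a < b,\ 0 < a &\implies a \cdot a < a \cdot b && \text{multiply } a < b \text{ by } a > 0\\
a < b,\ 0 < b &\implies a \cdot b < b \cdot b && \text{multiply } a < b \text{ by } b > 0\\
&\implies a^2 < b^2 && \text{transitivity again}
\end{align*}
```

The first line is the step the question was missing: the hypothesis never allows $b < 0$.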

ANSWER

You can also use the property $u>0,v>0\implies uv>0$.

$b^2-a^2=\underbrace{(b-a)}_{>0}\underbrace{(b+a)}_{>0}>0$
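For completeness, a sketch of why both factors under the braces are positive (using the same transitivity observation as the accepted answer):

```latex
\begin{align*}
b - a &> 0 && \text{from } a < b\\
b &> 0 && \text{from } 0 < a \text{ and } a < b\\
b + a &> 0 && \text{sum of two positives}\\
(b-a)(b+a) &> 0 && \text{product of two positives, i.e. } b^2 - a^2 > 0
\end{align*}
```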

ANSWER

You start with $a < b$ and $a > 0$.

Therefore $0 < a < b$, which in particular means $0 < b$: $b$ has to be greater than zero, so the negative case never arises.
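The whole argument is short enough to formalize. Here is a sketch in Lean 4 with Mathlib; the theorem name `square_lt_square_of_pos` is my own, while `lt_trans`, `mul_lt_mul_left`, and `mul_lt_mul_right` are standard Mathlib lemmas (assuming I recall their signatures correctly):

```lean
import Mathlib

-- From 0 < a and a < b, conclude a * a < b * b.
theorem square_lt_square_of_pos {a b : ℝ} (ha : 0 < a) (hab : a < b) :
    a * a < b * b :=
  -- the step the question was missing: b is forced to be positive
  have hb : 0 < b := lt_trans ha hab
  calc a * a < a * b := (mul_lt_mul_left ha).mpr hab   -- multiply a < b by a > 0
       _     < b * b := (mul_lt_mul_right hb).mpr hab  -- multiply a < b by b > 0
```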