Prove that if $a<1/a<b<1/b$ then $a<-1$


The following is Exercise 3.2.8 from Velleman:

Suppose that $a$ and $b$ are nonzero real numbers. Prove that if $a<1/a<b<1/b$ then $a<-1$.

I solved it using the hint in the back of the book, but I am certain that there is a more elegant solution.

The hint was: assume $a<1/a<b<1/b$, prove that $a<0$, use it to prove $a<-1$.

Best answer:

[There isn't really a short way if you want to do it rigorously. Here is a straightforward method.]

Given nonzero reals $a,b$ such that $a < \frac{1}{a} < b < \frac{1}{b}$:

  Multiplying $a < \frac{1}{a}$ through by $a^2$ (which is positive, since $a \neq 0$) gives $a \cdot a^2 < \frac{1}{a} \cdot a^2$.

  Thus $a^3 < a$.

  Thus $a^3 - a < 0$, which factors as $(a+1)a(a-1) < 0$.

  Thus $a < -1$ or $0 < a < 1$.

  Similarly $b < -1$ or $0 < b < 1$.

  If $0 < a < 1$:

    Since $b < 1$ in both cases above, $\frac{1}{a} > 1 > b$, contradicting the hypothesis $\frac{1}{a} < b$.

  Therefore $a < -1$.
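The claim can also be sanity-checked numerically (a quick sketch, not a proof; the sampling range $[-3, 3]$ and the sample count are arbitrary choices):

```python
import random

def holds(a: float, b: float) -> bool:
    """Hypothesis of the exercise: a < 1/a < b < 1/b."""
    return a < 1/a < b < 1/b

# Whenever the hypothesis holds for sampled nonzero reals,
# the conclusion a < -1 should hold as well.
rng = random.Random(0)
samples = [(rng.uniform(-3, 3), rng.uniform(-3, 3)) for _ in range(100_000)]
violations = [(a, b) for a, b in samples
              if a != 0 and b != 0 and holds(a, b) and not a < -1]
print(len(violations))  # 0 expected if the claim is true
```

For instance, $a=-2$, $b=0.5$ satisfies the hypothesis ($-2 < -\frac{1}{2} < \frac{1}{2} < 2$) and indeed has $a < -1$.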

Another answer:

Since $a<b$ and $a^{-1}<b^{-1}$, we must have $ab<0$. (Why?)
Hence, $a<0<b$.
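The "(Why?)" can be filled in with a one-line contradiction argument (a short sketch):

```latex
% Suppose toward a contradiction that ab > 0.  Multiplying
% 1/a < 1/b through by ab preserves the inequality:
ab \cdot \frac{1}{a} < ab \cdot \frac{1}{b}
\quad\Longrightarrow\quad b < a,
% contradicting the hypothesis a < b.  Hence ab < 0.
```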

Now multiply $a<a^{-1}$ through by $a<0$; the inequality reverses, giving $a^2>1$. Combined with $a<0$, this forces $a<-1$.