According to what rule is $\frac{a+b}{a} = \frac{b}{a+b} = \frac {a+b-b}{a-a-b}=-\frac {a}{b}$ true and why?


I want to prove that $a$ and $b$ that satisfy

$$\frac{a+b}{a} = \frac{b}{a+b}$$

cannot both be real.

Here is the natural solution.

$$(a+b)^2=ab$$ $$a^2+2ab+b^2=ab$$ $$a^2+ab+b^2=0$$

$a=0$ makes the denominator zero, hence $a\ne 0$. Dividing by $a^2$ and putting $\frac {b}{a}=t$, we get $t^2+t+1=0$. But the discriminant is negative: $D=1-4=-3$. So there are no real solutions.
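This can be checked numerically with Python's standard `cmath` module (an illustrative sketch of my own, not part of the question): the roots of $t^2+t+1=0$ are non-real, and the pair $(a,b)=(1,t)$ built from one root satisfies the original equation.

```python
import cmath

# Roots of t^2 + t + 1 = 0 via the quadratic formula; D = 1 - 4 = -3 < 0.
t = (-1 + cmath.sqrt(-3)) / 2

assert abs(t**2 + t + 1) < 1e-9   # t really is a root
assert t.imag != 0                # ...and it is not real

# With a = 1 and b = t * a, the equation (a+b)/a = b/(a+b) holds,
# because it is equivalent to (a+b)^2 = ab, i.e. t^2 + t + 1 = 0.
a = 1
b = t * a
assert abs((a + b) / a - b / (a + b)) < 1e-9
```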

I just have some questions about how the following "bogus" solution works; of course, I'm not sure it really is bogus. I copied the LaTeX text and added it here.

Let

$$\frac{a+b}{a} = \frac{b}{a+b}=k$$

Since $\thinspace ab≠0\thinspace $ and $\thinspace a+b\thinspace $ exists on both sides of the equation with the same sign, then $\thinspace a\thinspace $ and $\thinspace b\thinspace $ can not have opposite signs. This implies that, $\thinspace ab>0\thinspace $ or $\thinspace \dfrac ab>0\thinspace$, which means $\thinspace k>0\thinspace .$

Therefore, you have:

$$ \begin{align}0<k&=\frac {a+b-b}{a-a-b}=-\frac ab<0\end{align} $$

A contradiction.

How can $ab > 0$ without knowing the sign of $a+b$ ? My other question is, according to what rule is $\frac{a+b}{a} = \frac{b}{a+b}=\frac {a+b-b}{a-a-b}=-\frac {a}{b}$ true and why?

Thank you for references.

There are 4 best solutions below

Your solution makes a couple of leaps of logic that need more justification, i.e. you are a bit hand-wavy.

$$\frac{a+b}{a} = \frac{b}{a+b}=k$$

Since $\thinspace ab≠0\thinspace $ and $\thinspace a+b\thinspace $ exists on both sides of the equation with the same sign, then $\thinspace a\thinspace $ and $\thinspace b\thinspace $ can not have opposite signs.

This is confusing. First off, $ab\neq 0$ has not been established yet (although it is true). But yes, the general gist is that the sign of $a$ and $b$ must be equal, however I would simply write that if you just look at the signs of both sides of the equation, you get $$\mathrm{sgn}(a+b)\cdot \mathrm{sgn}(a)=\mathrm{sgn}(a+b)\cdot\mathrm{sgn}(b)$$ which means $\mathrm{sgn}(a)=\mathrm{sgn}(b)$.
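The sign rule being used here, $\mathrm{sgn}(x/y)=\mathrm{sgn}(x)\cdot\mathrm{sgn}(y)$ for nonzero reals, can be sanity-checked with a small sketch (the `sgn` helper is mine, not from the answer):

```python
def sgn(x):
    """Sign of x: -1, 0, or 1."""
    return (x > 0) - (x < 0)

# For nonzero x, y: sgn(x / y) = sgn(x) * sgn(y), which is what turns
# (a+b)/a = b/(a+b) into sgn(a+b)*sgn(a) = sgn(a+b)*sgn(b).
for x in (3.0, -2.5, 0.7, -1.0):
    for y in (4.0, -0.5, 2.0):
        assert sgn(x / y) == sgn(x) * sgn(y)
```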


However, your major mistake comes later on, when you write:

$$0<k=\frac {a+b-b}{a-a-b}=-\frac ab<0$$

In this inequality, you wrote down the equation $k=\frac {a+b-b}{a-a-b}$, yet you have zero explanation as to where that equation comes from. You defined, originally, that $k=\frac{a+b}{a}$ (and equivalently, that $k=\frac b{a+b}$). You then do not mention $k$ at all, until it appears all of a sudden in your argument in the equation $k=\frac {a+b-b}{a-a-b}$. Where did this come from?


How can $ab > 0$ without knowing the sign of $a+b$?

I suspect the reasoning there was a consideration of cases on the possible signs of the expressions: if $a+b>0$, then the sign of $a$ is the same as the sign of $\frac{a+b}{a}$, and the sign of $b$ is the same as the sign of $\frac{b}{a+b}$; since these are equal, $a$ and $b$ have the same sign. The case $a+b<0$ leads to the same result.

The easier way to see this is that the original equation implies $ (a+b)^2 = ab $. We know $a+b \ne 0$, so if $a$ and $b$ are real, then $ab = (a+b)^2 > 0$.
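In fact, $(a+b)^2=ab$ rearranges to $a^2+ab+b^2=0$, and completing the square shows this is impossible for real $a,b$ not both zero:

$$a^2+ab+b^2=\left(a+\frac{b}{2}\right)^2+\frac{3b^2}{4}\ge 0,$$

with equality only when $a=b=0$.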

According to what rule is $\frac{a+b}{a} = \frac{b}{a+b}=\frac {a+b-b}{a-a-b}=-\frac {a}{b}$ true and why?

From the answer of JBL:

if $\frac{a}{b} = \frac{c}{d}$ and $b \neq d $ then both are equal to $\frac{a - c}{b - d}$.


The step that the other two answers don't understand is the following basic fact of fraction arithmetic: if $\frac{a}{b} = \frac{c}{d}$ and $b \neq d $ then both are equal to $\frac{a - c}{b - d}$.
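For a concrete instance of this rule, here is an illustrative check using Python's exact `fractions` type (the numbers are arbitrary):

```python
from fractions import Fraction

# If a/b = c/d and b != d, then both equal (a - c)/(b - d).
# Example: 2/3 = 4/6, and (2 - 4)/(3 - 6) = (-2)/(-3) = 2/3.
a, b, c, d = 2, 3, 4, 6
assert Fraction(a, b) == Fraction(c, d)
assert Fraction(a, b) == Fraction(a - c, b - d)
```

The rule itself follows by writing $a = kb$ and $c = kd$: then $a-c = k(b-d)$, so $\frac{a-c}{b-d}=k$ whenever $b\neq d$.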


The second solution isn't bogus. In fact it's slightly harder to figure out.

$$\dfrac{a+b}{a}=\dfrac{b}{a+b}=k$$ Here, after cross multiplication, we see $$(a+b)^2=ab$$ Now $(a+b)^2 \gt 0$: indeed, if $(a+b)^2=0$, then $ab=0$, so $a=0$ or $b=0$, and together with $a+b=0$ this would force $a=b=0$. But $a \not = 0$ and $a+b \not = 0$, as $k$ would become undefined. $$\therefore (a+b)^2 \gt 0 $$ $$\implies ab\gt 0 \implies \dfrac ab \gt 0\implies k\gt 0$$ (since $a$ and $b$ must be of the same sign, $a+b$ shares that sign, so both fractions defining $k$ are positive).

Now for the tricky step, $$\dfrac{a+b}{a}=\dfrac{-b}{-(a+b)}=\dfrac{a+b-b}{a-a-b}=\dfrac{a}{-b}=-\dfrac {a}{b} \lt 0\implies k \lt 0$$

(Note: if $\dfrac ab=\dfrac cd$ with $a,b,c,d \in \mathbb {R}$ and $b+d\not = 0$, then both are equal to $\dfrac {a+c}{b+d}$.)
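The whole chain can be verified numerically for a concrete complex pair (my own choice: $a=1$ and $b$ a root of $t^2+t+1=0$; a sketch, not part of the answer):

```python
import cmath

# a = 1, b a root of t^2 + t + 1 = 0, so (a+b)/a = b/(a+b) holds.
a = 1
b = (-1 + cmath.sqrt(-3)) / 2

k1 = (a + b) / a                 # first expression for k
k2 = b / (a + b)                 # second expression for k
k3 = (a + b - b) / (a - a - b)   # the combined fraction from the tricky step
k4 = -a / b                      # simplified form

for other in (k2, k3, k4):
    assert abs(k1 - other) < 1e-12
```

For this complex pair all four expressions agree, which is exactly why the "contradiction" only rules out real $a$ and $b$.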

Hence we arrive at a contradiction: $k$ cannot be both $\gt 0$ and $\lt 0$ simultaneously.

Therefore $a$ and $b$ cannot both be real.