Proof verification: $a$ and $b$ that satisfy $\frac{a+b}{a} = \frac{b}{a+b}$ cannot both be real.


I want to prove that $a$ and $b$ that satisfy

$\frac{a+b}{a} = \frac{b}{a+b}$

cannot both be real. Is the following argument correct:

$\frac{a+b}{a} = \frac{b}{a+b} \quad \leftrightarrow \quad (a+b)^2=ab \quad \leftrightarrow \quad a^2+b^2=-ab $
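
Cross-multiplying is legitimate here because the denominators $a$ and $a+b$ must be non-zero for the equation to make sense; written out in full, the expansion is

$$ (a+b)^2=ab \quad \leftrightarrow \quad a^2+2ab+b^2=ab \quad \leftrightarrow \quad a^2+ab+b^2=0 \quad \leftrightarrow \quad a^2+b^2=-ab. $$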

Let's assume that both $a$ and $b$ are real. Then, in any case, $a^2+b^2$ is positive. Yet for the right side of the equation to be positive, $ab$ must be negative, so $a$ and $b$ have different signs.

We also have the following:

$(a+b)^2 = ab \quad \leftrightarrow \quad a+b = \sqrt{a}\sqrt{b}$

As either $a$ or $b$ is negative, but not both, either $\sqrt{a}$ or $\sqrt{b}$ is imaginary, but not both. So, $\sqrt{a}\sqrt{b}$ is a complex number. This, in turn, means that either $a$ or $b$ must contribute the imaginary part of the complex number $a+b$. Hence, we have a contradiction, and thus it is not possible for $a$ and $b$ to both be real.

I don't trust myself when it comes to proofs, so I'm looking for reassurance that this is correct or constructive feedback if there is a mistake. Thanks! (:

There are 5 answers below.

Best answer:

Your idea is right, just keep it simple. Suppose $a, b \in \mathbb{R}$. Then $(a+b)^{2}=ab$ implies $ab\geqslant 0$, while $a^{2}+b^{2}=-ab$ implies $ab\leqslant 0$, so together the two conditions force $ab=0$. Now $a\neq 0$ (it appears as a denominator), so $b=0$; but then $(a+b)^2=ab=0$ gives $a+b=0$, hence $a=0$, a contradiction. Therefore, $a$ and $b$ cannot both be real.
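
Not needed for the argument, but if you want extra reassurance, a computer algebra system reaches the same conclusion. A minimal SymPy sketch (assuming SymPy is available; the symbols and calls are just illustrative):

```python
# A quick symbolic sanity check (not part of the proof): solve the
# original equation (a+b)/a = b/(a+b) for b in terms of a.
import sympy as sp

a, b = sp.symbols('a b')
solutions = sp.solve(sp.Eq((a + b) / a, b / (a + b)), b)
print(solutions)
# Both roots are equivalent to a*(-1 ± sqrt(3)*I)/2, i.e. a times the
# non-real cube roots of unity.  Specialising to a = 1:
print([s.subs(a, 1).simplify() for s in solutions])
# gives -1/2 ± sqrt(3)*I/2, which are non-real, as claimed.
```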

Answer:

If $t = b/a$, then the equation simplifies to the quadratic $t^2+t+1=0$, whose two roots are non-real (they are two of the three cube roots of unity).
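
Explicitly, by the quadratic formula the two roots are

$$ t=\frac{-1\pm i\sqrt3}{2}, $$

which are precisely the non-real cube roots of unity, since $t^3-1=(t-1)(t^2+t+1)$.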

Since $t= b/a$, if $a$ is real then $b$ has to be complex, and if $b$ is real then $a$ has to be complex. As you say, both cannot be real at the same time, and $a+b$ is always complex.

Answer:

$\frac{a+b}{a} = \frac{b}{a+b} \quad \leftrightarrow \quad (a+b)^2=ab \quad \leftrightarrow \quad a^2+b^2=-ab$

Rearranging and multiplying by $\,(a-b)\,$:

$$ 0 = a^2+ab+b^2 \quad\implies\quad 0 = (a-b)(a^2+ab+b^2) = a^3 - b^3 $$

If $a,b$ are real numbers, then $a^3=b^3 \implies a=b$ (the real cube function is strictly increasing, hence injective), but the original equality is either undefined if $a=b=0$ or false if $a=b\ne0$, so no real solutions exist.
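
Indeed, for the last case, plugging $a=b\neq0$ into the original equation gives

$$ \frac{a+b}{a} = \frac{2a}{a} = 2 \qquad\text{while}\qquad \frac{b}{a+b} = \frac{a}{2a} = \frac12, $$

so the equality fails.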

Answer:

Without using algebraic expansion.


Let

$$ \frac{a+b}{a}=\frac {b}{a+b}=k $$

Since $ab=0$ leads to a contradiction (if $b=0$ the two sides are $1$ and $0$, and if $a=0$ the left-hand side is undefined), we have $ab\neq 0$. Observing that $a+b$ and $\dfrac{1}{a+b}$ have the same sign, the equality $(a+b)\cdot\dfrac{1}{a}=b\cdot\dfrac{1}{a+b}$ shows that $\dfrac{1}{a}$ and $b$ cannot have opposite signs. Hence $\dfrac ba>0$, which means $k=1+\dfrac ba>0$.

Therefore, you have:

$$ \begin{align}0<k&=\frac {a+b-b}{a-a-b}=-\frac ab<0\end{align} $$

A contradiction.
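
For reference, the chain above uses the standard proportion rule: if $\dfrac pq=\dfrac rs=k$ and $q\neq s$, then

$$ k=\frac{p-r}{q-s}, $$

applied here with $p=a+b$, $q=a$, $r=b$, $s=a+b$ (note that $q-s=-b\neq0$ since $ab\neq0$).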


Regarding the method you apply, you can also finish your solution by completing the square:

$$ \begin{align}a^2+ab+b^2&=\left(b+\frac a2\right)^2+\frac {3a^2}{4}\\ &>0,\quad \forall a\neq 0\,.\end{align} $$

Answer:

Let $$b=xa\Rightarrow\\\frac{a+xa}{a}=\frac{xa}{a+xa}\Rightarrow \\\frac{a(1+x)}{a}=\frac {xa}{a(1+x)}\Rightarrow\\1+x=\frac{x}{1+x}\Rightarrow\\(1+x)^2=x\Rightarrow\\x^2+x+1=0$$

But this equation has no roots in $\mathbb{R}$, since its discriminant is $1-4=-3<0$.