I'm presented with the equation $\frac{a+b}{a} = \frac{b}{a+b}$
Performing cross multiplication yields $a^2+2ab+b^2 = ab$
Subtracting $ab$ from both sides, we get $a^2+ab+b^2 = 0$
Multiplying both sides by $(a-b)$ and simplifying:
$(a-b)(a^2+ab+b^2) = 0 \cdot (a-b)$
$a^3 - b^3 = 0$
$a^3 = b^3$
$a = b$
Substituting into the original equation, we finally arrive at $2 = \frac{1}{2}$ ... something has obviously gone terribly wrong. Where did I mess up?
It is also worth noting that the original problem explicitly states that $a \neq b$.
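For what it's worth, the contradiction is easy to confirm numerically (a quick sanity check in Python with exact rational arithmetic; the choice $a = b = 1$ is just one representative nonzero value):

```python
from fractions import Fraction

# Plug the derived "solution" a = b back into the original equation,
# using a = b = 1 as a representative nonzero value.
a = b = Fraction(1)
lhs = (a + b) / a      # left side:  (a+b)/a
rhs = b / (a + b)      # right side: b/(a+b)
print(lhs, rhs)        # prints 2 and 1/2 -- so a = b cannot satisfy the equation
```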
You multiplied by $a-b$, which is $0$ precisely when $a=b$. That step is invalid: multiplying an equation by an expression that can vanish introduces the zeros of that expression as spurious solutions, and $a=b$ is exactly the root you introduced. (Indeed, over the reals $a^2+ab+b^2 = \left(a+\frac{b}{2}\right)^2 + \frac{3b^2}{4}$, which is zero only when $a=b=0$, so the equation you started from has no admissible real solution at all.) To give a similar example, say you have $$x-1=0.$$ If I multiply both sides by $x+1$, I get $$x^2-1=0,$$ i.e. $$x=\pm 1.$$ Obviously $x=-1$ is not a solution; you introduced it when you multiplied by $x+1$.
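You can see the extraneous root appear symbolically as well (a short sympy sketch of my own, not part of the derivation; `expr` is just a local name):

```python
from sympy import symbols, expand, simplify

a, b = symbols('a b')
expr = a**2 + a*b + b**2

# Substituting a = b into a^2 + ab + b^2 gives 3*b^2, which is nonzero
# for b != 0 -- so a = b does NOT solve the equation before the multiplication.
assert simplify(expr.subs(a, b)) == 3*b**2

# But (a-b)*(a^2+ab+b^2) expands to a^3 - b^3, which IS zero at a = b:
# the factor a-b is where the spurious solution comes from.
assert expand((a - b) * expr) == a**3 - b**3
```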