I first tried brute force with $-1000 \leq a,b \leq 1000$ but found no solution apart from $a=b=0$. But then a simple argument showed me that there is none, not only in the integers but even in the rationals. What was that argument?
The Diophantine equation $a^2+ab-b^2=0$
369 Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 6 answers below.
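A brute-force search of the kind the question describes can be sketched as follows (a minimal Python sketch; the function name `search` and the reduced bound are illustrative, not from the original post):

```python
# Brute-force search for integer solutions of a^2 + a*b - b^2 = 0.
# The question used the range [-1000, 1000]; a smaller bound is used here
# to keep the run fast, since the result is the same.
def search(limit=50):
    hits = []
    for a in range(-limit, limit + 1):
        for b in range(-limit, limit + 1):
            if a * a + a * b - b * b == 0:
                hits.append((a, b))
    return hits

print(search(50))  # only the trivial solution (0, 0) turns up
```

As the answers below explain, the only pair the search can ever find is $(0,0)$.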
Why is it impossible to solve the equation $a^2+ab-b^2=0$? Solving for $a$ with the quadratic formula gives
$$a=\frac{-b\pm\sqrt{b^2+4b^2}}{2}=b\frac{-1\pm\sqrt{5}}{2}$$
Since $\sqrt{5}$ is irrational, $a$ cannot be a rational multiple of $b$, so you can see there is no solution at all (apart from $a=b=0$).
If instead the equation were $a^3+ab-b^2=0$, one would act similarly: rewrite it as $b^2-ab-a^3=0$, so
$$b=\frac{a\pm\sqrt{a^2+4a^3}}{2}=a\frac{1\pm\sqrt{4a+1}}{2}=a\frac{1\pm{k}}{2}$$
Choose $a$ so that the root is an integer: take $\sqrt{4a+1}=k$ with $k$ odd, i.e. $a=\frac{k^2-1}{4}$.
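The cubic family above can be checked numerically (a small sketch; the helper name `family` is mine, and the sign choice $b=a\frac{1+k}{2}$ is one of the two branches):

```python
# Integer solutions of a^3 + a*b - b^2 = 0 from the family in the answer:
# for odd k, set a = (k^2 - 1)/4 (an integer, since k^2 ≡ 1 mod 8) and
# b = a*(1 + k)/2, so that the discriminant a^2 + 4a^3 = a^2 * k^2 is a square.
def family(k):
    assert k % 2 == 1, "k must be odd"
    a = (k * k - 1) // 4
    b = a * (1 + k) // 2
    return a, b

for k in (3, 5, 7, 9):
    a, b = family(k)
    assert a**3 + a * b - b * b == 0  # each pair really solves the cubic
    print(k, (a, b))
```

For example $k=3$ gives $(a,b)=(2,4)$, and indeed $2^3+2\cdot4-4^2=0$.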
If $a^2+pab-b^2=0$
$$a=\frac{-pb\pm\sqrt{p^2b^2+4b^2}}{2}=b\frac{-p\pm\sqrt{p^2+4}}{2}$$
There are nontrivial rational solutions only when $p^2+4$ is a perfect square, which happens only for $p=0$.
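The claim about the generalized equation $a^2+pab-b^2=0$ can be confirmed by scanning the discriminant $p^2+4$ (a sketch; `has_square_disc` is an illustrative name):

```python
import math

# Rational solutions of a^2 + p*a*b - b^2 = 0 require p^2 + 4 to be a
# perfect square. Scan a range of integer p to see which qualify.
def has_square_disc(p):
    d = p * p + 4
    r = math.isqrt(d)
    return r * r == d

sols = [p for p in range(-100, 101) if has_square_disc(p)]
print(sols)  # [0]
```

This matches the algebra: $p^2+4=m^2$ forces $(m-|p|)(m+|p|)=4$, hence $p=0$.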
For integers it is easy: since $b^2 = a(a+b)$, any prime $p$ that divides $a$ also divides $b$, and conversely (as $a^2 = b(b-a)$), so $\frac{a}{b}$ can never be a fraction in lowest terms. For rationals, multiply by the denominators to reduce to the integer case.
Without loss of generality we can assume that $\gcd(a,b)=1$ and $a,b$ are not both zero. Taking the equation modulo $a$ (or $b$) gives $b^2 \equiv 0 \pmod{a}$, which contradicts $\gcd(a,b)=1$ unless $a = \pm 1$; but then $b^2 \mp b - 1 = 0$ has no integer root.
For rational numbers, simply write $a$ and $b$ with common denominators, and then clear denominators to get an integer solution. So if there is no integer solution, there is no rational solution.
For integers:
$$a^2+ab-b^2=0$$ $$4a^2+4ab-4b^2=0$$ $$4a^2+4ab+b^2=5b^2$$ $$(2a+b)^2=5b^2$$
The left side is a perfect square; the right side is not unless $b=0$, since $5$ then appears to an odd power in its factorization (Fundamental Theorem of Arithmetic). So $b=0$, which forces $a=0$: there is no nontrivial solution.
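The key step, that $5b^2$ is never a nonzero perfect square, can be sanity-checked numerically (a sketch; `five_b2_is_square` is an illustrative name):

```python
import math

# 5*b^2 can be a perfect square only when b = 0: for b != 0, the prime 5
# occurs to an odd power in the factorization of 5*b^2.
def five_b2_is_square(b):
    n = 5 * b * b
    r = math.isqrt(n)
    return r * r == n

print([b for b in range(1, 1000) if five_b2_is_square(b)])  # []
```

No positive $b$ below $1000$ (or anywhere, by the argument above) makes $5b^2$ a square.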
If there were a rational solution, we could multiply it through by the square of a suitable non-zero integer to get an integer solution, so we may suppose that $a,b \in \mathbb{Z}.$ But now if $c = {\rm gcd}(a,b)$ we can divide through by $c^{2}.$ Hence we may suppose that $a,b \in \mathbb{Z}$ and that ${\rm gcd}(a,b) = 1.$ Now, however, $b^{2} = a(a+b),$ so that $a$ divides $b^{2}$. Hence $a$ divides ${\rm gcd}(a^{2},b^{2}) = 1.$ Thus $a = \pm 1.$ Similarly, $b$ divides $a^{2}$ and $b = \pm 1.$ These two facts together yield a contradiction, as $a^{2} +ab$ is then even and $b^{2}$ is odd.
We will prove this by checking the different parity cases:
Case 1: Both are odd. In that case $a^2 - b^2$ is even and $ab$ is odd. You cannot get zero then.
Case 2: $a$ is odd and $b$ is even: $a^2 - b^2$ is then odd and $ab$ is even. Again, not possible to get zero.
Case 3: Same as case 2 with $b$ being odd and $a$ even.
Case 4: Both are even. Let $a=2n$ and $b=2m$; then $a^2+ab-b^2=0 \Rightarrow 4n^2 +4nm -4m^2=0 \Rightarrow n^2+nm-m^2=0$. If $n$ and $m$ are again both even, repeat this until one of the other three cases applies; since none of those is possible, the halving could only go on forever, which happens only for $a=b=0$. Hence there is no nonzero solution.
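The reduction step in Case 4 rests on the form being homogeneous, which is easy to verify on a grid of values (a sketch; the helper `f` is an illustrative name):

```python
# Case 4 reduction: if (a, b) solves a^2 + a*b - b^2 = 0 with both entries
# even, then (a/2, b/2) solves the same equation, because the quadratic form
# is homogeneous of degree 2: f(2n, 2m) = 4 * f(n, m).
def f(a, b):
    return a * a + a * b - b * b

# Verify the scaling identity on a grid of values (not just on solutions).
for n in range(-10, 11):
    for m in range(-10, 11):
        assert f(2 * n, 2 * m) == 4 * f(n, m)

print("identity holds on the grid")
```

So an all-even solution could be halved forever, and infinite descent rules out any nonzero integer solution.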