I found this problem in the 'logic' section of my discrete mathematics textbook (I am currently a freshman majoring in Computer Science). It is a demonstration problem, and I feel like I might have solved it, but I would like to know whether or not the procedure is solid.
Basically, it asks to prove by reductio ad absurdum that there are no positive integers $a$ and $b$ such that: $$a=b^2-a^2$$
I assumed that such $a$ and $b$ do exist, so the condition $a=b^2-a^2$ is satisfied. I moved $a^2$ to the left-hand side and got $$a+a^2=b^2$$
Factoring: $$a(1+a)=b^2$$
Then finally $$a=\frac{b^2}{1+a}$$
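(As a quick sanity check, separate from the proof itself, I also brute-forced small values in Python to confirm that $a(1+a)$ never comes out as a perfect square:)

```python
import math

def solutions(max_a):
    """Find all positive integers a <= max_a for which a*(1+a) is a
    perfect square b^2, i.e. a = b^2 - a^2 has a positive solution b."""
    hits = []
    for a in range(1, max_a + 1):
        n = a * (1 + a)
        b = math.isqrt(n)        # integer square root of n
        if b * b == n:           # n is a perfect square
            hits.append((a, b))
    return hits

print(solutions(100_000))  # prints [] -- no solutions found
```

Of course this only checks finitely many cases; the actual argument is below.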
Now, if $a$ is even, then $\frac{b^2}{1+a}$ is even too, since they are equal. For this to hold:
- if $b^2$ is even, there is no issue: since $a$ is even, $1+a$ is odd, and the resulting quotient is an even number;
- BUT if $b^2$ is odd, $1+a$ can't be even, because an odd number divided by an even number is not an integer (and $a$ belongs to $\mathbb{N}$). So the denominator must be odd, but then the quotient $a$ would be odd as well (and I guess this is the first contradiction, because $a$ cannot be both even AND odd);
Second scenario: if $a$ is odd, then $\frac{b^2}{1+a}$ is odd too, since they are equal. In this case:
- if $b^2$ is even, $1+a$ must be even, and since $a$ is odd, $1+a$ will indeed be even;
- BUT if $b^2$ is odd, $1+a$ must be odd. Another contradiction, since $a$ and $1+a$ would then both be odd;
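(One more small check I ran, not part of the argument above: $a$ and $1+a$ are consecutive integers, so their product $a(1+a)=b^2$ is always even, which seems consistent with the "$b^2$ odd" branches ending in contradictions:)

```python
# Of any two consecutive integers a and 1+a, one is even,
# so their product a*(1+a) is always even.
# Hence b^2 = a*(1+a) could never be odd in the first place.
def product_always_even(max_a):
    return all(a * (1 + a) % 2 == 0 for a in range(1, max_a + 1))

print(product_always_even(10_000))  # prints True
```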
With that, the demonstration should be complete. Feel free to point out any mistakes you might discover. This is also my first question on Mathematics, so I would appreciate feedback on how the question is written and formatted as well.