Let $a$ and $b$ be positive integers such that $ab + 1$ divides $a^2 + b^2$. Show that $$\frac{a^2 + b^2}{ab+1}$$ is the square of an integer.
I have a few questions about the proof. First, here it is:
Assume $$\frac{a^2 + b^2}{ab+1}=k$$ is not a perfect square. Rearranging, we get a quadratic in $a$: $$a^2 - kb\,a + (b^2 - k) = 0. \quad (*)$$ Clearly $(*)$ has two roots; one of them is $a$, so let the other be $s$. Then by Vieta's formulas, $$s = kb - a = \frac{b^2-k}{a}.$$ The first equation shows $s$ is an integer. Moreover $s \neq 0$, since $s = 0$ would give $k = b^2$, a perfect square; and $s < 0$ is impossible, because then $s^2 - kb\,s + (b^2 - k) \ge s^2 + kb + b^2 - k > 0$, contradicting $(*)$. So $s$ is a positive integer.
Now, since $(*)$ is symmetric in $a$ and $b$, we may assume $a > b$ (the case $a = b$ can be ignored, because it forces $k$ to be a perfect square). Then $$s = \frac{b^2 - k}{a} \le \frac{b^2 - 1}{a} < \frac{a^2}{a} = a,$$ so we have a descent. Right?
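Just to convince myself the jump behaves as claimed, here is a small numerical sketch (illustration only, not part of the proof) on the pair $(a, b) = (30, 8)$, which gives $k = 964/241 = 4$:

```python
# One Vieta jump on (a, b) = (30, 8); here k = (a^2 + b^2) / (ab + 1) = 4.
a, b = 30, 8
k, r = divmod(a * a + b * b, a * b + 1)
assert r == 0  # ab + 1 divides a^2 + b^2

# Second root of a^2 - kb*a + (b^2 - k) = 0, via Vieta: s = kb - a.
s = k * b - a
assert s == (b * b - k) // a                  # product of roots: a*s = b^2 - k
assert 0 <= s < a                             # the descent: s is smaller than a
assert (s * s + b * b) % (s * b + 1) == 0     # (s, b) is again a valid pair
assert (s * s + b * b) // (s * b + 1) == k    # with the same k
print((a, b), "->", (s, b), "k =", k)         # (30, 8) -> (2, 8) k = 4
```

So the jump replaces $(30, 8)$ by the strictly smaller pair $(2, 8)$ without changing $k$.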
In my book the author says: "So, do we have a descent? If we repeat the process with $(s, b)$, we would get a quadratic in $s$, and we pick the second root. Do you see an issue?" But I don't really see any issue.
There is an issue: if you repeat the process on $(s, b)$ by writing the quadratic in $s$ and taking its second root, that second root is just $a$ again, so you climb right back up. The clean way to phrase the argument is by minimality. Assume there is a smallest positive integer $a$ over all pairs for which $k$ is not a perfect square. Using the quadratic above, whenever $a \neq b$ we can find a positive root $s < a$ that satisfies the same requirements. But this contradicts the assumption that $a$ was minimal, so $k$ must be a perfect square.
This is the method of descent: we can always produce a strictly smaller root that fits the requirements, but since the positive integers are well-ordered, we cannot do this forever. Hence there are no pairs with $k$ not a perfect square.
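The full descent can also be watched on an example (an illustrative script under the assumption that we always jump on the larger entry, not the proof itself): repeatedly replacing the larger entry by $kb - a$ eventually reaches a pair containing $0$, at which point $k$ equals the square of the other entry.

```python
def vieta_descent(a, b):
    """Repeatedly apply the Vieta jump to (a, b) until one entry is 0.

    Assumes ab + 1 divides a^2 + b^2; k stays constant along the way.
    """
    assert (a * a + b * b) % (a * b + 1) == 0
    k = (a * a + b * b) // (a * b + 1)
    trail = [(a, b)]
    while min(a, b) > 0:
        if a < b:
            a, b = b, a          # keep a as the larger entry
        a = k * b - a            # replace a by the second root s = kb - a
        trail.append((a, b))
    return k, trail

k, trail = vieta_descent(30, 8)
print(k, trail)  # k = 4; the trail ends at a pair containing 0,
                 # and k = (other entry)^2 = 2^2
```

For $(30, 8)$ this produces the trail $(30,8) \to (2,8) \to (0,2)$ with $k = 4 = 2^2$, exactly the base case of the descent.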