Is it possible to prove that for natural $a,b$ the value of $a/b+b/a$ is never natural, except when $a=b$?
2026-04-12 02:00:35.1775959235
Prove $a/b+b/a$ for $a$ and $b$ natural is only natural for $a=b$
210 Views Asked by Bumbble Comm https://math.techqa.club/user/bumbble-comm/detail
4
There are 4 best solutions below
1
We have $\frac{a}{b}+\frac{b}{a} = \frac{a^2 + b^2}{ab}$, so $a/b+b/a$ being a natural number means $a^2 + b^2 = k \cdot ab$ for some natural number $k$. Dividing by $b^2$ shows that the rational number $x = a/b$ is a root of $x^2 - kx + 1 = 0$. By the rational root theorem, any rational root of this monic integer polynomial is an integer dividing the constant term $1$, so $x = \pm 1$; since $a$ and $b$ are natural, $x = 1$. Hence $a = b$ (and then $k = 2$).
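Not part of the argument above, just a quick brute-force sanity check of the claim; the bound `N = 50` is an arbitrary choice for illustration:

```python
# Check that for natural a, b up to N, the sum a/b + b/a is a natural
# number exactly when a == b (and its value is then 2).
from fractions import Fraction

N = 50
hits = [(a, b) for a in range(1, N + 1) for b in range(1, N + 1)
        if (Fraction(a, b) + Fraction(b, a)).denominator == 1]

# Every pair where the sum is an integer has a == b, and the sum is 2.
assert all(a == b for a, b in hits)
assert all(Fraction(a, b) + Fraction(b, a) == 2 for a, b in hits)
```

Using exact `Fraction` arithmetic avoids any floating-point rounding issues in the divisibility test.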
6
Suppose $a$ and $b$ are relatively prime (if they are not, cancel the common factor; this does not change the fractions). We have $$\frac{a}{b} + \frac{b}{a} = \frac{a^2 + b^2}{ab},$$ so you are asking for $a^2 + b^2 = kab$ for some natural $k$.
This would mean that $b^2 = a(kb - a)$ is divisible by $a$; since $\gcd(a, b) = 1$, the only possibility is $a = 1$. By symmetry $b = 1$ as well, so $a = b = 1$, which implies $a = b$ for the original pair if we do not assume they are relatively prime.
We have $a^2 + b^2 = kab$, $k \in \mathbb{N}$. Define $d := \gcd(a, b)$, $a' := \tfrac{a}{d}$, $b' := \tfrac{b}{d}$, so $a'^2 + b'^2 = k a' b'$. If $p$ is a prime factor of $a'$, then $p \mid b'^2 = k a' b' - a'^2$, so $p \mid b'$. But $\gcd(a', b') = 1$, so no prime divides $a'$. Hence $a' = 1$; similarly $b' = 1$, so $a = b = d$ and $k = 2$.
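A small numeric check of the coprime step used in the last two answers; the bound of 100 is arbitrary:

```python
# Among coprime pairs (a', b') with 1 <= a', b' <= 100, the sum
# a'^2 + b'^2 is a multiple of a'*b' only for a' = b' = 1,
# and the quotient k is then 2.
from math import gcd

sols = [(a, b) for a in range(1, 101) for b in range(1, 101)
        if gcd(a, b) == 1 and (a * a + b * b) % (a * b) == 0]

assert sols == [(1, 1)]
assert (1 * 1 + 1 * 1) // (1 * 1) == 2  # k = 2 at the unique solution
```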