Exponential diophantine: $(a^r+1)(b^s+1)=c^t+1$?


I've been trying to solve this for a while to no avail.

Problem: Find all integers $a,b,c,r,s,t$ such that $(a^r+1)(b^s+1)=c^t+1$.

(In fact, the problem I was trying to solve had $a^r+1,b^s+1\in \mathbb{P}$ where $\mathbb{P}$ is the set of all primes.)

Any hints are appreciated. (You can also post a solution if you want.)
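Before attempting a proof, a quick brute-force search over small ranges shows which solutions to expect (a sketch with arbitrarily chosen bounds, not part of the original question):

```python
# Search small ranges for solutions of (a^r + 1)(b^s + 1) = c^t + 1.
solutions = []
for a in range(0, 8):
    for r in range(2, 5):
        for b in range(0, 8):
            for s in range(2, 5):
                lhs = (a**r + 1) * (b**s + 1)
                for c in range(0, 60):
                    for t in range(2, 7):
                        if c**t + 1 == lhs:
                            solutions.append((a, r, b, s, c, t))
print(solutions)
```

In these ranges the search only turns up degenerate families, e.g. $a=b=c=0$, or $a=1$ (where $a^r+1=2$), such as $(1^2+1)(2^2+1)=3^2+1$.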

Here is an attempt, based on information added in the comments:

The LHS is the product of two primes $e$ and $d$, so that $ed=c^t+1$. Let us first consider the case where all the exponents are $>1$ and both $(a^2+1)$ and $(b^2+1)$ are prime.
Let the smaller prime be $d$ (note that $d=e$ would imply that $t$ is even).

We first prove that $c^t$ must be a square. If $t$ is even, then $c^t$ is trivially a square.

If $t$ is odd, then $c^t+1=(c+1)\left(c^{t-1}-c^{t-2}+\cdots-c+1\right)$.
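The odd-exponent factorization used here can be checked numerically (a small sketch, with my own helper name `cofactor`):

```python
# Check c^t + 1 = (c + 1)(c^(t-1) - c^(t-2) + ... - c + 1) for odd t.
def cofactor(c, t):
    # Alternating sum c^(t-1) - c^(t-2) + ... - c + 1 (t terms).
    return sum((-1)**i * c**(t - 1 - i) for i in range(t))

for c in range(1, 20):
    for t in (3, 5, 7):
        assert c**t + 1 == (c + 1) * cofactor(c, t)
print("identity holds for all tested c, t")
```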

2nd Edit:

We assume that the first factor is $d=c+1$, and therefore

$a^2+1=c+1$, i.e. $a^2=c$, and therefore $c^t=(a^2)^t=(a^t)^2$, so that $c^t+1=f^2+1$ with $f=a^t$.

We must therefore have $(a^2+1)(b^2+1)=f^2+1$, or

$(ab)^2+a^2+b^2=f^2.$
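That this reduction is just the expansion of the product can be verified directly (a minimal numeric check):

```python
# (a^2 + 1)(b^2 + 1) = (ab)^2 + a^2 + b^2 + 1, so subtracting 1 from
# both sides of (a^2 + 1)(b^2 + 1) = f^2 + 1 gives (ab)^2 + a^2 + b^2 = f^2.
for a in range(1, 30):
    for b in range(1, 30):
        assert (a**2 + 1) * (b**2 + 1) - 1 == (a * b)**2 + a**2 + b**2
print("expansion verified")
```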

The factors on the LHS are odd primes (each exceeds $2$ for $a,b>1$), so both $a$ and $b$ must be even and $a^2b^2$ must contain a factor of $16$.

Now divide the equation by $4$ repeatedly until at least one term on the LHS is odd.

We therefore have $4a_1^2b_1^2+a_1^2+b_1^2=f_1^2$. But $a_1^2$ and $b_1^2$ must be of the form $4n+1$, since $4n$ (i.e. $0 \pmod 4$) is ruled out by our assumptions (and $4n+2$ and $4n+3$ are impossible for squares). The first term is $0 \pmod 4$, so $f_1^2$ would have to be of the form $4n+2$, which is impossible for a square.
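The mod-4 obstruction in the last step can be confirmed with a short check (assuming, as the argument does, that $a_1$ and $b_1$ are both odd):

```python
# Squares mod 4 can only be 0 or 1, so f_1^2 = 2 (mod 4) is impossible;
# for odd a_1, b_1 the LHS 4(a_1 b_1)^2 + a_1^2 + b_1^2 is 2 (mod 4).
assert {n * n % 4 for n in range(100)} == {0, 1}
for a1 in range(1, 50, 2):       # odd a_1
    for b1 in range(1, 50, 2):   # odd b_1
        assert (4 * (a1 * b1)**2 + a1**2 + b1**2) % 4 == 2
print("mod-4 obstruction confirmed")
```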

What remains, if we disregard the restrictions on the exponents and variables, are the trivial solutions with $a,b,c=0$ etc., as mentioned in the comments by Mario Carneiro.