Relating to $( a ^ b ) + c = d$ and $( a ^ b ) - c = d$


Given an integer $d$, is there a way of deducing integer values for $a$, $b$ and $c$ that satisfy either

$( a ^ b ) + c = d$

or

$( a ^ b ) - c = d$

such that the sum $a + b + c$ is as small as possible?

I want to do this for some very large integers, up to approximately 14 000 000 000 000 digits long, though I may have to start with smaller numbers.

I could write a computer program to determine some values, but I was wondering whether there is some higher-level mathematics that could provide a solution.


There is 1 answer below.


If $b>1$ and $a^b\approx d$, then $(a\pm1)^b$ differs from $a^b$ by $\approx ba^{b-1}\approx\frac{bd}{a}$, hence we can achieve $c\approx \frac{bd}{a}$ or better. For given $b$ we will have $a\approx \sqrt[b]{d}$, hence $$ a+b+c\approx \sqrt[b]{d}+b+\frac{bd}{\sqrt[b]{d}}.$$ In this light, the best choice seems to be $b=2$, as it keeps the last summand small. Anything better than that is "luck", e.g. if $d$ happens to be awfully close to some high (odd) power.

Computationally, it is not too problematic to compute $a=\lfloor \sqrt[b]{d}\rfloor$ for $b=2,3,\ldots$ and determine the $c$ belonging to $a^b$ and $(a+1)^b$, until you reach $a<10$, say. After that, try the obvious best choices of $b$ and $c$ for $a=2,\ldots,9$ accordingly.
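The search described above can be sketched in Python, whose integers are arbitrary-precision (the function names `iroot` and `best_representation` are my own, not from the answer, and for simplicity the loop runs until $a<2$ rather than cutting off at $a<10$). Note that floating-point roots like `d ** (1/b)` lose precision for huge $d$, so the integer $b$-th root is computed by binary search:

```python
def iroot(d, b):
    """Integer b-th root: the largest a with a**b <= d.
    Binary search, exact even for very large integers."""
    if d < 2:
        return d
    lo, hi = 1, 1 << (d.bit_length() // b + 1)  # hi is a safe upper bound
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if mid ** b <= d:
            lo = mid
        else:
            hi = mid - 1
    return lo

def best_representation(d):
    """Return (a, b, c, sign) minimising a + b + c
    subject to a**b + sign*c == d, for d >= 4."""
    best = None
    b = 2
    while True:
        a = iroot(d, b)
        if a < 2:
            break  # only a = 1 remains; larger b cannot help
        # Test both candidates a^b (below d) and (a+1)^b (above d).
        for cand in (a, a + 1):
            c = abs(d - cand ** b)
            total = cand + b + c
            if best is None or total < best[0]:
                sign = 1 if d >= cand ** b else -1
                best = (total, cand, b, c, sign)
        b += 1
    _, a, b, c, sign = best
    return a, b, c, sign
```

For example, `best_representation(1000003)` finds $10^6 + 3$, i.e. $(a,b,c) = (10,6,3)$ with sum $19$, illustrating the answer's caveat that a nearby high power can beat the generic $b=2$ choice. The loop runs for $O(\log_2 d)$ values of $b$, each step costing one integer root, so it stays cheap even for large $d$.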