A question related to the sum-of-divisors function


In what follows, let $\sigma$ denote the sum-of-divisors function, and assume that $\sigma(a^b)\sigma(c^2)=2{a^b}{c^2}$, where $a$ and $c$ are coprime integers greater than $1$ and $a$ is prime. Let $I(x)=\sigma(x)/x$ denote the abundancy index of the positive integer $x$.

If $c < a^b \leq \sigma(c) \leq \sigma(a^b)$, then it follows that $b > 1$ and $$1 \leq \frac{\sigma(c)}{a^b} \leq I(a^b) < I(c) \leq \frac{\sigma(a^b)}{c} < 2.$$

My question is this:

Does it follow that $$a^{b - 1} < c?$$

Here is my attempt:

Since $I(c) < 2$ gives $\sigma(c) < 2c$, the hypothesis $a^b \leq \sigma(c)$ yields $a^b < 2c$, so that $a^{b - 1} < \frac{2}{a}\cdot{c} \leq c$, because $a$ is an integer with $a > 1$, hence $a \geq 2$.

Is my proof correct?
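As a numerical sanity check (not a proof): no explicit triple $(a, b, c)$ satisfying the full hypothesis $\sigma(a^b)\sigma(c^2)=2{a^b}{c^2}$ is known, so the sketch below only tests the final arithmetic step, namely that $a \geq 2$ and $a^b < 2c$ together force $a^{b-1} < c$, over a range of small values (the function name is my own):

```python
def step_holds(a: int, b: int, c: int) -> bool:
    """Given a >= 2 and a**b < 2*c, check that a**(b - 1) < c."""
    assert a >= 2 and a ** b < 2 * c
    return a ** (b - 1) < c

# Exhaustively test all small triples meeting the precondition a**b < 2*c.
ok = all(
    step_holds(a, b, c)
    for a in range(2, 20)
    for b in range(1, 10)
    for c in range(2, 500)
    if a ** b < 2 * c
)
print(ok)  # True
```

This only exercises the last inequality; it says nothing about whether the chain $c < a^b \leq \sigma(c) \leq \sigma(a^b)$ can actually occur.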