Is there any nonconstant function that grows at infinity slower than all iterations of the (natural) logarithm?
5.6k Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 5 best solutions below.
Yes:
Here is my favorite big-number function from positive numbers to positive numbers, a continuous variant of Old John's.
Write $\log_{10}^{(n)} x$ for the $n$th iterated base-10 logarithm of $x$.
If $\log_{10}^{(n)} x \in [0,1)$, then let $f(x) = n + \log_{10}^{(n)} x$.
You can extend this to non-positive numbers with $f(0)=0$ and $f(-x)=-f(x)$, giving a continuous bijection on the reals.
If you want a smooth function, take natural logarithms (base $e$) instead.
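This definition can be sketched directly in code; the helper below (a hypothetical name, with the usual floating-point caveats) counts base-10 log iterations until the value lands in $[0,1)$:

```python
import math

def f(x):
    """Continuous slow-growing function: n plus the n-th iterated log10 of x,
    where n is the first iteration count landing the value in [0, 1).
    Extended to all reals via f(0) = 0 and f(-x) = -f(x)."""
    if x == 0:
        return 0.0
    if x < 0:
        return -f(-x)
    n = 0
    while not (0 <= x < 1):
        x = math.log10(x)
        n += 1
    return n + x
```

For example, $f(10) = 2$: one log gives $1$, a second gives $0$, so $n=2$ and the fractional part is $0$.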
In fact there are functions that go to $\infty$ more slowly than any function you can write down a formula for. For positive integers $n$ let $f(BB(n)) = n$ where $BB$ is the Busy Beaver function. Extend to $[1,\infty)$ by interpolation.
EDIT: Stated more technically, a "function you can write down a formula for" is a recursive function: it can be computed by a Turing machine. $BB(n)$ is not recursive, and grows faster than any recursive function. If $g(n)$ is a recursive (integer-valued, for simplicity), nondecreasing function with $\lim_{n \to \infty} g(n) = \infty$, then there is a recursive function $h$ such that for all positive integers $n$, $g(h(n)) > n^2$. Namely, here is an algorithm for calculating $h(n)$ for any positive integer $n$: start at $y = 1$ and increment $y$ until $g(y) > n^2$, then output $y$.
Now since $BB$ grows faster than any recursive function, for sufficiently large $n$ (say $n \ge N$) we have $BB(n) > h(n+1)$. For any integer $x \ge h(N)$, there is $n \ge N$ such that $h(n) \le x < h(n+1)$, and then (since $f$ and $g$ are nondecreasing) $$f(x) \le f(h(n+1)) \le f(BB(n)) = n < \sqrt{g(h(n))} \le \sqrt{g(x)}$$ and thus $$\dfrac{f(x)}{g(x)} \le \dfrac{1}{\sqrt{g(x)}} \to 0 \quad \text{as } x \to \infty.$$
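The search for $h(n)$ described above is itself a simple program. A sketch (the factory name `make_h` is hypothetical; any computable nondecreasing $g$ tending to infinity works):

```python
def make_h(g):
    """Given a recursive, nondecreasing, unbounded g on the positive integers,
    return the recursive h with g(h(n)) > n**2: increment y until g(y) > n^2."""
    def h(n):
        y = 1
        while g(y) <= n * n:
            y += 1
        return y
    return h
```

With the toy choice $g(y) = y$, for instance, $h(3)$ is the least $y$ with $y > 9$, namely $10$.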
Let $\text{LogNestMonster}_i(n) = \mathrm{nest}(\log, i)(n) = \overbrace{\log(\log(\cdots(\log}^{i\text{ times}}(n))\cdots))$.
Many functions grow slower than $\text{LogNestMonster}_i$, no matter how large you pick $i$:
The inverse Ackermann function $\alpha$ grows slower than all $\text{LogNestMonster}$s. It (roughly) tells you how far in the sequence $1+1$, $2 \cdot 2$, $3^3$, $4\uparrow\uparrow4$, $5\uparrow\uparrow\uparrow5$, $...$ you have to go before exceeding $n$.
The iterated logarithm $\log^*$ also grows slower than all $\text{LogNestMonster}$s. It counts how many times you have to apply $\log$ to $n$ before the result is less than $1$.
Also, consider the function $\frac{1}{n}$. It is not constant and is clearly asymptotically less than all $\text{LogNestMonster}$s, although I'm guessing it's not exactly what you had in mind since $\frac{1}{n}$ is $O(1)$ despite not being $\Theta(1)$.
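Both $\text{LogNestMonster}_i$ and $\log^*$ are easy to sketch (hypothetical helper names, natural logs assumed as in the answer):

```python
import math

def log_nest_monster(i, n):
    """Apply the natural log exactly i times: LogNestMonster_i(n)."""
    for _ in range(i):
        n = math.log(n)
    return n

def log_star(n):
    """Iterated logarithm: how many times log must be applied to n
    before the result drops below 1."""
    count = 0
    while n >= 1:
        n = math.log(n)
        count += 1
    return count
```

For a fixed $i$, `log_nest_monster(i, n)` still tends to infinity with $n$, while `log_star(n)` increases only when $n$ crosses the next tower $e^{e^{\cdot^{\cdot^{e}}}}$, which is why it eventually falls below every fixed nesting depth.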
Yes, just take a function which is equal to $\log x$ on an initial interval such as $[1,K_1]$, then equal to $\log\log x$ on the interval $(K_1,K_2]$, then $\log\log\log x$ etc. where the values $K_1, K_2, \dots$ are chosen sufficiently large to ensure that the function really does tend to infinity. This can always be done, since each of the iterations of the log function does tend to infinity.
This function will not be continuous, but a similar function could certainly be created which is continuous.
This will produce a function which tends to infinity more slowly than any (fixed) iteration of $\log$. If you want something even slower, then call my function above $\phi(x)$, and repeat my construction using $\phi(\phi(x))$, etc., on successive intervals.
If this is still not slow enough, then ...
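One concrete choice of breakpoints that makes this construction work (an assumption for illustration; the answer only requires the $K_i$ to be "sufficiently large") is $K_i = \exp^{(i)}(i)$, so that $x \le K_i$ exactly when $\log^{(i)} x \le i$. The segment containing $x$ can then be found by peeling logs:

```python
import math

def phi(x):
    """Piecewise slow function for x > 1: on (K_{i-1}, K_i] return log applied
    i times, with the assumed breakpoints K_i = exp iterated i times at i."""
    i = 0
    while True:
        x = math.log(x)
        i += 1
        if x <= i:   # first i with log^{(i)}(x) <= i identifies the segment
            return x
```

With these breakpoints the minimum of $\phi$ on segment $i$ is $\log(i-1)$, which tends to infinity, so $\phi(x) \to \infty$ even though it is not monotone; on segment $i$ it is bounded by $i$, so it is eventually smaller than any fixed $\log^{(k)}$.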