In his paper *Splay Trees, Davenport-Schinzel Sequences, and the Deque Conjecture*, Seth Pettie proves that a particular series of operations on a splay tree takes amortized time $\alpha^*(n)$, where $\alpha^*(n)$ is the number of times that the inverse Ackermann function $\alpha$ must be applied to $n$ to drop it to a constant.
The inverse Ackermann function already grows so absurdly slowly that I can't even begin to conceive of how slowly $\alpha^*(n)$ grows. The best I can do for the Ackermann function itself is to see it written out in Knuth's up-arrow notation and then reason that $\alpha$ grows as slowly as the Ackermann function grows quickly. However, I have no conception of what the inverse of $\alpha^*$ might be, so this approach hasn't really panned out for me.
Is there a way to quantitatively reason about how $\alpha^*$ fits into the hierarchy of slowly-growing functions?
It's essentially the iterated logarithm $\log^*$, but with the inverse Ackermann function $\alpha$ in place of $\log$. Alternatively, you could try to iterate the Ackermann function and then take the inverse:
$$f(n+1)=\operatorname{Ack}(f(n),f(n))\Rightarrow n\approx\alpha^\star(f(n))$$
Since $g_{n+1}\approx\operatorname{Ack}(g_n,g_n)$, where $g_{64}$ is Graham's number, we roughly have $n\approx\alpha^\star(g_n)$.
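To get a feel for just how flat this is in practice, here is a toy Python sketch (my own construction and naming, not anything from Pettie's paper), taking $\alpha(n)$ to be the least $k$ with $\operatorname{Ack}(k,k)\ge n$ and $\alpha^\star(n)$ to be the number of applications of $\alpha$ needed to drop $n$ to at most $2$:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def ack(m, n):
    """Standard two-argument Ackermann(-Peter) function."""
    if m == 0:
        return n + 1
    if n == 0:
        return ack(m - 1, 1)
    return ack(m - 1, ack(m, n - 1))

# Ack(0,0)=1, Ack(1,1)=3, Ack(2,2)=7, Ack(3,3)=61; Ack(4,4) is already far
# beyond anything a computer can store, so alpha(n) <= 4 for every n we can
# actually feed in.
DIAGONAL = [ack(k, k) for k in range(4)]   # [1, 3, 7, 61]

def alpha(n):
    """Least k with Ack(k,k) >= n (valid for any machine-representable n)."""
    for k, v in enumerate(DIAGONAL):
        if v >= n:
            return k
    return 4   # n > 61, but inevitably n < Ack(4,4)

def alpha_star(n):
    """How many times alpha must be applied to drop n to at most 2."""
    count = 0
    while n > 2:
        n = alpha(n)
        count += 1
    return count

print(alpha(10**100))        # 4
print(alpha_star(10**100))   # 2: 10**100 -> 4 -> 2
```

So for every number you could ever physically write down, $\alpha^\star(n)\le 2$ under this convention; it only starts to move once the argument passes things like $\operatorname{Ack}(4,4)$, Graham's number, and so on.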
This is a bit unclear: what is meant by "hierarchy of slowly-growing functions"? In my honest opinion, the best way to understand these things is as the inverses of fast-growing functions, like the one above.
Ultimately, you need some basis from which to extend your intuition, but these functions simply grow faster than anything you would usually encounter. So my point is that there really isn't a good way to explain this kind of thing, since anything I could point you to will necessarily be something "just as absurd".
Personally, if you were to ask me how I would try to conceptualize functions that grow this fast, I would lead you to the $\operatorname{Iter}_m$ maps, or as I like to call them, iteration functions:
$$\operatorname{Iter}_1=f\mapsto n\mapsto\underbrace{f(\dots f(}_nn)\dots)$$
Let $s=n\mapsto n+1$ be the successor function. Then we have, for example,
$$\operatorname{Iter}_1(s)(n)=n+\underbrace{1+\dots+1}_{n}=2n$$
$$\operatorname{Iter}_1(\operatorname{Iter}_1(s))(n)=\underbrace{2\times\dots\times 2}_{n}\,n=2^n n$$
$$\operatorname{Iter}_1(\operatorname{Iter}_1(\operatorname{Iter}_1(s)))(n)\approx\operatorname{Ack}(4,n)$$
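In case code makes this more tangible, here is a direct Python transcription (names are my own) of $\operatorname{Iter}_1$ and the three examples above:

```python
def iter1(f):
    """Iter_1(f)(n): apply f to n, n times."""
    def g(n):
        x = n
        for _ in range(n):
            x = f(x)
        return x
    return g

s = lambda n: n + 1        # the successor function

double = iter1(s)          # n -> 2n
exp    = iter1(double)     # n -> 2^n * n
tower  = iter1(exp)        # roughly Ack(4, n)

print(double(10))   # 20
print(exp(10))      # 10240 == 2**10 * 10
print(tower(2))     # 2048; tower(3) already has over 100 million digits
```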
Essentially, all we're doing is iterating whatever function we're given as input. Compare this to the curried Ackermann function:
$$\operatorname{Ack}(m+1)=n\mapsto\underbrace{\operatorname{Ack}(m)(\dots\operatorname{Ack}(m)(}_{n+1}1)\dots)$$ $$\operatorname{Ack}(m+1)\approx\operatorname{Iter}_1(\operatorname{Ack}(m))$$
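You can build the curried Ackermann function literally from that identity; here is a small sketch (my own Python phrasing of the $(n+1)$-fold iteration in the display above):

```python
def ack_curried(m):
    """Ack(m) as a one-argument function, built by iterating Ack(m-1)."""
    if m == 0:
        return lambda n: n + 1             # Ack(0) is the successor
    prev = ack_curried(m - 1)
    def level(n):
        x = 1
        for _ in range(n + 1):             # apply Ack(m-1) exactly n+1 times to 1
            x = prev(x)
        return x
    return level

print(ack_curried(2)(3))   # 9  == Ack(2,3) = 2*3 + 3
print(ack_curried(3)(3))   # 61 == Ack(3,3)
```

The only difference from $\operatorname{Iter}_1(\operatorname{Ack}(m))$ is the off-by-one bookkeeping (start from $1$ instead of $n$, iterate $n+1$ instead of $n$ times), which is exactly what the $\approx$ is sweeping under the rug.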
from which you should be able to see the close similarity. The same is true with Knuth's up-arrows:
$$a\uparrow^{m+1}\approx\operatorname{Iter}_1(a\uparrow^m)$$
if you think of these as functions of the right-hand argument, i.e. as functions that just repeatedly iterate other functions.
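The same construction in code, each arrow level iterating the one below it (again my own helper, using the convention $a\uparrow^m 1=a$):

```python
def up(a, m):
    """a 'arrow^m' as a one-argument function of n, built by iteration."""
    if m == 1:
        return lambda n: a ** n            # one arrow is ordinary exponentiation
    prev = up(a, m - 1)
    def arrow(n):
        x = a
        for _ in range(n - 1):             # iterate the previous level n-1 times
            x = prev(x)
        return x
    return arrow

print(up(2, 2)(4))   # 2↑↑4 = 2^2^2^2 = 65536
print(up(3, 2)(3))   # 3↑↑3 = 3^27 = 7625597484987
print(up(2, 3)(3))   # 2↑↑↑3 = 2↑↑(2↑↑2) = 2↑↑4 = 65536
```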
We then have higher-order iteration, defined as follows:
$$\operatorname{Iter}_2=f\mapsto g\mapsto n\mapsto\underbrace{f(\dots f(}_ng)\dots)(n)$$
from which you can get things such as:
$$\operatorname{Ack}(n,n)\approx\operatorname{Iter}_2(\operatorname{Iter}_1)(s)(n)$$
which is simply what happens when you iterate over the number of times we iterate. The inverse of your $\alpha^\star$ would then be something along the lines of:
$$n\approx\alpha^\star\left(\operatorname{Iter}_1(A)(n)\right)$$
where $A=\operatorname{Iter}_2(\operatorname{Iter}_1)(s)$, the function from the previous display (growing roughly like $\operatorname{Ack}(n,n)$), is now itself being iterated.
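Here is the whole picture in one self-contained Python sketch (repeating iter1 and s from above so it runs on its own): $\operatorname{Iter}_2$, the diagonal $A$, and the rough inverse of your $\alpha^\star$:

```python
def iter1(f):
    """Iter_1(f)(n): apply f to n, n times (same as in the earlier sketch)."""
    def g(n):
        x = n
        for _ in range(n):
            x = f(x)
        return x
    return g

def iter2(f):
    """Iter_2(f)(g)(n): apply f to the *function* g, n times, then feed it n."""
    def apply_to(g):
        def out(n):
            cur = g
            for _ in range(n):
                cur = f(cur)
            return cur(n)
        return out
    return apply_to

s = lambda n: n + 1

A = iter2(iter1)(s)            # roughly n -> Ack(n, n)

print(A(1))   # 2 = Iter_1(s)(1)
print(A(2))   # 8 = Iter_1(Iter_1(s))(2) = 2^2 * 2
# A(3) already has over 100 million digits, so better not to print it.

# The rough inverse of alpha^* is then Iter_1(A); even its value at 2,
# namely A(A(2)) = A(8), is hopelessly far beyond anything computable.
inverse_alpha_star = iter1(A)
```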
Hopefully this is the kind of intuition you wanted.