I'm curious whether there is a general method for computing the "iterator function" (not sure if this is the correct name, but it's what I'm calling it) of a recurrence relation.
Specifically, given a recurrence relation:
$$T(n) = \begin{cases} h(1) & n = 1\\ T(f(n)) + h(n) & \mathrm{otherwise} \end{cases}$$
I'm interested in finding a function $g(k)$ such that $g(k) = n$ and $f(g(k)) = g(k-1)$. With this new function we can create an equivalent recurrence $S(k) = T(g(k))$: $$S(k) = \begin{cases} h(1) & g(k) = 1\\ S(k-1) + h(g(k)) & \mathrm{otherwise} \end{cases}$$
Alternatively, you could consider that I'm looking for a function $g(k)$ such that $g(k-i) = f^{(i)}(n)$, where $f^{(i)}(n)$ denotes $f$ applied to $n$, $i$ times (e.g. $f^{(3)}(n) = f(f(f(n)))\ $).
This can then easily be converted into a summation: $$T(n) = S(k) = \sum_{i = g^{-1}(1)}^{g^{-1}(n)} h(g(i))$$
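The unrolling into a sum can be checked directly. Here is a minimal Python sketch, assuming the hypothetical choices $f(n) = n/3$ and $h(n) = 1$ (so $g(k) = 3^k$ and $S(k) = k + 1$, the number of points in the orbit):

```python
def T(n, f, h, base=1):
    """Evaluate T(n) by unrolling: sum h over the orbit n, f(n), f(f(n)), ..., base."""
    total = h(base)
    while n != base:
        total += h(n)
        n = f(n)
    return total

# f(n) = n // 3, h(n) = 1: the orbit of n = 3**k has k + 1 points,
# so T(g(k)) = S(k) = k + 1.
for k in range(6):
    assert T(3 ** k, lambda n: n // 3, lambda n: 1) == k + 1
```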
The idea comes from my other answer over here. I am not sure of a general approach to finding the iterator $g(k)$. I have mostly found them through trial and error.
A simple example is:
$$T(n) = \begin{cases} 1 & n = 2\\ T(\sqrt{n}) + \log \log n & \mathrm{otherwise} \end{cases}$$
Then the iterator function is $g(k) = 2^{2^k}$, and this translates into an equivalent recurrence $S(k) = T(2^{2^k}) = T(n)$ (the base case $n = 2$ corresponds to $k = 0$, since $g(0) = 2$):
$$S(k) = \begin{cases} 1 & k = 0\\ S(k-1) + k & \mathrm{otherwise} \end{cases}$$
Then we can easily see $S(k) = 1 + \sum_{i = 1}^k i = 1 + \frac{k(k+1)}{2}$. We then work backwards to determine $T(n)$ as: $$\begin{align*} T(n) & = T(g(k))\\ & = S(k)\\ & = 1 + \frac{k(k+1)}{2}\\ & = 1 + \frac{g^{-1}(n)(g^{-1}(n) + 1)}{2}\\ & = 1 + \frac{\log \log n (1 + \log \log n)}{2} \end{align*}$$
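As a quick numeric sanity check of this example, the following Python sketch evaluates the recurrence directly and compares it to the closed form (using exact values $n = 2^{2^k}$ so that the integer square root and base-2 logs stay exact):

```python
import math

def T(n):
    """T(n) = T(sqrt(n)) + log2(log2(n)), with T(2) = 1."""
    if n == 2:
        return 1
    return T(math.isqrt(n)) + math.log2(math.log2(n))

def closed_form(n):
    # k = g^{-1}(n) = log2(log2(n)); the leading 1 comes from the base case T(2) = 1
    k = math.log2(math.log2(n))
    return 1 + k * (k + 1) / 2

for k in range(5):
    n = 2 ** (2 ** k)  # n = g(k)
    assert T(n) == closed_form(n)
```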
You can also assume $f(n)$ is monotone decreasing.
Choosing a suitable $g(\cdot)$, we have
$$ T(g(g^{-1}(n))) = T(g(g^{-1}(f(n))))+h(n) $$
so with $T' = T \circ g$ it follows that
$$ T'(g^{-1}(n))=T'(g^{-1}(f(n)))+h(n) $$
Now, setting $k = g^{-1}(n)$, we can consider two trivially soluble cases:
$$ g^{-1}(f(n)) = \begin{cases}g^{-1}(n)+m\\ mg^{-1}(n)\end{cases} $$
giving
$$ T'(k) = T'(k+m) + h(g(k))\\ T'(k) = T'(mk)+h(g(k)) $$
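As a hypothetical instance of the first (additive) case: $T(n) = T(n/2) + n$ with $g(k) = 2^k$ gives $g^{-1}(f(n)) = g^{-1}(n) - 1$, i.e. $m = -1$, so $T'(k) = T'(k-1) + h(2^k)$ is a plain sum. A Python sketch of this check:

```python
# Hypothetical first-case example: T(n) = T(n/2) + n, T(1) = 1.
# With g(k) = 2**k, g^{-1}(f(n)) = log2(n) - 1, an additive shift (m = -1).

def T(n):
    return 1 if n == 1 else T(n // 2) + n

def T1(k):
    """T'(k) = T(2**k), satisfying T'(k) = T'(k-1) + h(2**k) with h(n) = n."""
    return 1 if k == 0 else T1(k - 1) + 2 ** k

# The shifted recurrence is a geometric sum: T'(k) = 2**(k+1) - 1.
for k in range(8):
    assert T(2 ** k) == T1(k) == 2 ** (k + 1) - 1
```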
The second case reduces to the first via the further substitution $u = \log_m k$ (i.e. $k = m^u$): with $T''(u) = T'(m^u)$ we get
$$ T''(u) = T''(u+1) + h(g(m^u)) $$
In either case, recovering $T$ from the solution is trivial.
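To make the two-step reduction concrete, here is a sketch for the question's own example, $T(n) = T(\sqrt{n}) + \log\log n$ with $T(2) = 1$. It falls under the second case: with $g(k) = 2^k$ we get $g^{-1}(f(n)) = \tfrac{1}{2}\,g^{-1}(n)$, i.e. $m = 1/2$ (since $m < 1$ the index decreases toward the base, so the final shift runs downward):

```python
import math

def T(n):
    """Original recurrence: T(n) = T(sqrt(n)) + log2(log2(n)), T(2) = 1."""
    return 1 if n == 2 else T(math.isqrt(n)) + math.log2(math.log2(n))

def T1(k):
    """T'(k) = T(g(k)) = T(2**k): second case, T'(k) = T'(k/2) + log2(k)."""
    return 1 if k == 1 else T1(k // 2) + math.log2(k)

def T2(u):
    """T''(u) = T'(2**u): a plain additive shift, T''(u) = T''(u-1) + u."""
    return 1 if u == 0 else T2(u - 1) + u

# All three agree along n = 2**(2**u), k = 2**u.
for u in range(5):
    k = 2 ** u
    n = 2 ** k
    assert T(n) == T1(k) == T2(u)
```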