Consider a sequence of functions $(G_n(t))$ on $\Bbb{R}$ that satisfies the recurrence relation
$$ G_0(t) = e^t, \qquad G_n(t) = e^t \left( 1 + \frac{G_{n-1}(t) - 1}{r} \right)^{r} $$
for some absolute constant $r > 1$. (You may assume $r \in \{2, 3, \dots\}$, since those are essentially the only values I am interested in.) Fix a constant $a > 1$. I am interested in the behavior of the sequence $(t_n)$, where $t_n$ is defined as the unique solution of
$$ G_n(t_n) = a, \quad t_n > 0. $$
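For numerical experimentation, here is a small Python sketch (my own, not part of the question; the values $r = 2$ and $a = 2$ below are purely illustrative). It evaluates $G_n$ by iterating the recurrence and locates $t_n$ by bisection, using the facts that $G_n(0) = 1 < a$ and $G_n(\log a) \geq G_0(\log a) = a$:

```python
import math

def G(n, t, r, cap=1e6):
    # G_0(t) = e^t; then iterate G_k(t) = e^t * (1 + (G_{k-1}(t) - 1)/r)^r.
    # Values are clipped at `cap` to avoid float overflow; we only ever
    # compare G_n(t) against a small target a, so clipping is harmless.
    g = math.exp(t)
    for _ in range(n):
        if g > cap:
            return cap
        g = math.exp(t) * (1.0 + (g - 1.0) / r) ** r
    return g

def t_n(n, r, a, iters=100):
    # G_n is continuous and strictly increasing in t, G_n(0) = 1 < a,
    # and G_n(log a) >= G_0(log a) = a, so bisection on [0, log a] works.
    lo, hi = 0.0, math.log(a)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if G(n, mid, r) < a:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for n in (1, 10, 100):
    print(n, t_n(n, r=2, a=2.0))
```

The clipping in `G` matters: for moderate $t$ the iterates grow doubly exponentially in $n$ and would overflow a float almost immediately.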
Then my question is as follows:
Q. What is the asymptotic behavior of $t_n$ as $n \to \infty$?
For my purposes, the ideal outcome would be an asymptotic of the form $t_n \sim \alpha \log n / n^2$, but even a one-sided bound of the form $t_n \geq \alpha \log n / n^2$ would be nice. Here are some observations on $G_n$ and $t_n$:
For any $t \geq 0$ we can inductively check that $$ \exp\{(n+1)t\} \leq G_n(t) \leq \exp \Big\{ \tfrac{r^{n+1}-1}{r-1} t \Big\} $$ (the lower bound follows from $(1+x/r)^r \geq 1+x$ for $x \geq 0$, the upper bound from $1 + (G-1)/r \leq G$ for $G \geq 1$). Moreover, these bounds capture the correct growth rates, in the sense that for each fixed $n \geq 0$ we have $t^{-1}\log G_n(t) \to n+1$ as $t \to 0^+$ and $t^{-1} \log G_n(t) \to (r^{n+1} - 1)/(r-1)$ as $t \to \infty$.
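These bounds are also easy to check numerically; a quick sanity check of my own (the choice $r = 3$ is arbitrary):

```python
import math

def G(n, t, r):
    # Iterate G_0(t) = e^t, G_k(t) = e^t * (1 + (G_{k-1}(t) - 1)/r)^r.
    g = math.exp(t)
    for _ in range(n):
        g = math.exp(t) * (1.0 + (g - 1.0) / r) ** r
    return g

r = 3
for n in range(6):
    for t in (0.01, 0.1, 0.5):
        lower = math.exp((n + 1) * t)
        upper = math.exp((r ** (n + 1) - 1) / (r - 1) * t)
        assert lower <= G(n, t, r) <= upper, (n, t)
print("bounds hold on the tested grid")
```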
Using a rather indirect method, I can prove that $1 \leq G_n(\epsilon n^{-2}) \leq 1 + Cn^{-1}$ holds for some constants $\epsilon > 0$ and $C > 0$. Since $1 + Cn^{-1} < a$ for all large $n$ and $G_n$ is increasing, this gives $t_n \geq \epsilon n^{-2}$ eventually; in particular, $(t_n)$ does not decay faster than $n^{-2}$.
Again using an indirect heuristic, I suspect that $t_n \leq C \log n / n^2$.
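For what it's worth, this guess can be probed numerically (again a sketch of my own; $r = 2$ and $a = 2$ are illustrative choices): solve $G_n(t_n) = a$ by bisection and watch whether $t_n\, n^2 / \log n$ settles down.

```python
import math

def G(n, t, r, cap=1e6):
    # Iterate the recurrence; values clipped at `cap` to avoid overflow,
    # which is harmless since we only compare G_n(t) to a small target a.
    g = math.exp(t)
    for _ in range(n):
        if g > cap:
            return cap
        g = math.exp(t) * (1.0 + (g - 1.0) / r) ** r
    return g

def t_n(n, r, a, iters=100):
    # Bisection on [0, log a]: G_n(0) = 1 < a and G_n(log a) >= G_0(log a) = a.
    lo, hi = 0.0, math.log(a)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if G(n, mid, r) < a:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# If t_n ~ alpha * log(n) / n^2, the last column should approach alpha.
for n in (10, 100, 1000):
    t = t_n(n, r=2, a=2.0)
    print(n, t, t * n * n / math.log(n))
```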
Remark. See @Did's comment for a probabilistic interpretation of the quantity $G_n(t)$.