Is there an O(1) solution for finding the number of times a simple iterated function must be applied to satisfy an inequality?
For example, if the function is
$$f(n) = 1.2n - 10; \quad n > 100$$ and we want to solve $$\min\{i \ge 0 : f^i(n)\le 100\}$$
Is there an $O(1)$ solution for this?
I don't think there's a good general theory that you can throw any function at and get good results.
In this case, however, the trick is to change the variable so that the function becomes a simple multiplication. We can do that by setting $m=n+a$, so our original iteration $$ n \mapsto 1.2 n - 10 $$ becomes $$ m \mapsto 1.2 (m-a) - 10 + a $$ which rearranges to $$ m \mapsto 1.2m + (a - 10 - 1.2a) $$ Here we want the constant term to be $0$; solving $a-10-1.2a=0$ yields $a=-50$, so with $m=n-50$ we have $$ m \mapsto 1.2 m $$
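As a quick sanity check on the change of variable, here is a small Python sketch (my own code, not from the post; the names `f` and `m` are mine) confirming that iterating $n \mapsto 1.2n - 10$ and iterating $m \mapsto 1.2m$ with $m = n - 50$ stay in lockstep:

```python
def f(n):
    return 1.2 * n - 10

n = 300.0
m = n - 50  # shifted coordinate

for _ in range(10):
    n = f(n)      # original iteration
    m = 1.2 * m   # pure multiplication in the shifted coordinate
    # the relation m = n - 50 is preserved at every step
    assert abs(n - (m + 50)) < 1e-6
```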
Then $n = 100$ corresponds to $m = 50$, so the $i$ for which $1.2^i m = 50$ is $i = \log_{1.2}(50/m)$.
However, it does not really make sense to ask for the first $i$ for which $f^i(n)\le 100$, because the iteration moves points away from the fixed point $n=50$ -- so unless the initial $n$ is already at most $100$, no amount of iterating the function will ever bring it down to $100$ or below.
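A short numeric sketch of both points above (my own code, not from the post): iterates starting above $100$ only grow, and the $i$ solving $1.2^i m = 50$ comes out negative for such starting points, i.e. it counts backward steps rather than forward ones:

```python
import math

# Starting above 100, the iterates diverge away from the fixed point n = 50:
n = 101.0
for _ in range(50):
    n = 1.2 * n - 10
print(n > 100)  # True: still above 100 after 50 steps

# The i solving 1.2**i * (n0 - 50) == 50 is negative for n0 > 100:
n0 = 300.0
i = math.log(50 / (n0 - 50), 1.2)
print(i < 0)    # True: only a backward (inverse) iterate reaches 100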