If $1-\frac{f(x)}x$ converges to $0$ on (finite) iteration, when does $f(x)$ converge on iteration?
Let $f:\Bbb Q\to\Bbb Q$.
I will write it more precisely to make plain that I'm considering independent iteration of the two functions:
Let $y_{m+1}=1-\dfrac{f(y_m)}{y_m}$, and for all $y_0\in\Bbb Q$ let there be some $p$ such that $y_p=0$.
Now let $x_{n+1}=f(x_n)$. Is there, for each $x_0\in\Bbb N$, some $q$ such that $x_{q+1}=x_q$?
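To make the two independent iterations concrete, here is a minimal sketch (my own illustration; the choice $f(x)=x$ is an assumption, used only because it trivially satisfies the first hypothesis with $p=1$):

```python
from fractions import Fraction

def f(x):
    # assumed example: the identity map f(x) = x
    return x

def g(y):
    # one step of the first iteration: y_{m+1} = 1 - f(y_m)/y_m
    return 1 - f(y) / y

# first iteration: y_p hits 0 after finitely many steps
y, p = Fraction(7, 3), 0          # arbitrary nonzero y_0 in Q
while y != 0:
    y, p = g(y), p + 1
print(p)  # 1: y_1 = 1 - y_0/y_0 = 0 for every nonzero y_0

# second iteration: x_{n+1} = f(x_n) is constant from the start
x = Fraction(5)
print(f(x) == x)  # True: q = 0 works
```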
If this consequence doesn't follow in general, I would greatly appreciate a better understanding of the conditions under which it does, or of what, if anything, is implied about $f(x)$.
UPDATE: I have found a counterexample to the question as stated, so what remains is: under what circumstances, with what extra conditions, or for what values of $x$ does $f(x)$ converge on iteration? This is a question about what the first condition tells us about the Fatou set of $f(x)$, so I've added the chaos-theory tag.
Let $ f $ be continuous.
Suppose $n$ iterations of $g(x) = 1 - \frac{f(x)}{x}$ reach the value $0$; then consider what happens after $n+1$ iterations.
After $n+1$ iterations we have $g(0) = 1 - \frac{f(0)}{0} = 1 - f'(0)$.
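For clarity, the division $f(0)/0$ here has to be read as a limit; assuming $f(0)=0$ (which is needed for the quotient to have a finite limit at all), the step is

$$ g(0) \;=\; \lim_{y\to 0}\left(1 - \frac{f(y)}{y}\right) \;=\; 1 - \lim_{y\to 0}\frac{f(y) - f(0)}{y - 0} \;=\; 1 - f'(0). $$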
Since $0$ is an attracting fixed point of $g$ we know $|g'(0)| < 1$. Also, after $2n$ iterations we again get $0$.
Notice that since all $x$ map to the same number after finitely many steps, the function $g$ cannot be analytic unless $n = 1$.
If $n = 1$ then $f(x) = x$.
In that case $f'(0) = 1$ and iteration of $f$ converges immediately.
After $n - 1$ iterations we have some $q$ that satisfies
$1 - f(q)/q = 0$, so $f(q)/q = 1$. This implies that $f$ has a fixed point $q$. Since $q$ depends on $f$ but not on $x$, yet at the same time is the result of iterations started at $x$, we must conclude that $f(x) = x$. Notice that since $g$ is not analytic, this is the only analytic possibility for $f$. This also explains $f(0)/0 = f'(0) = 1$, and thus after $n+1$ iterations we again get the value $1 - 1 = 0$.

Notice that since $q$ does not depend on $x$, we cannot meaningfully distinguish the values after $n-2$ iterations from the values after $0, 1, n-1, n, n+1$ iterations when $n-2 > 0$. This forces $n = 1$.
This makes sense, since for $n = 1$, or equivalently $f(x) = x$, we get $g(g(x)) = 0$ and $g'(0) = 0$, as expected for an unchanging function (since $n+1$ iterations also give $0$).
Note that any nonlinear analytic $f$ would make $g$ analytic, which would be a contradiction.
Also note that $n$, $2n$, and $n+1$ iterations all give the same value $0$, hence $n = 1$.
---
Perhaps a more interesting question is what happens when the limit of $g$ under iteration exists but is never reached in finitely many steps. When does that imply that $f$ converges under iteration? I think I know that too.
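As an illustration of this situation (my own example, not from the post): take $f(x) = x - x^2/2$, so that $g(x) = 1 - f(x)/x = x/2$. Iterates of $g$ then converge to $0$ but never reach it in finitely many steps, while iterates of $f$ also tend to $0$, just slowly:

```python
from fractions import Fraction

def f(x):
    # assumed example: f(x) = x - x^2/2, giving g(x) = 1 - f(x)/x = x/2
    return x - x * x / 2

def g(x):
    return 1 - f(x) / x

# iterating g: exact halving, so the limit 0 is approached but never reached
y = Fraction(1)
for _ in range(5):
    y = g(y)
print(y)  # 1/32

# iterating f: also tends to 0, roughly like 2/n
x = 1.0
for _ in range(50):
    x = f(x)
print(0 < x < 0.1)  # True
```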
---
I assume $f$ needs to be continuous for the limit of $g$ to be $0$ while actually reaching that value in finitely many steps.
---
I am aware that the OP probably meant convergence in a finite number of steps $n$ depending on $x$.
I took it as not depending on $x$.
Consider convergence after $n$ steps for $x$ and $m$ steps for $y$; then both $x$ and $y$ have converged after $nm$ steps, hence also after a finite number of steps.
By analogy, for $t$ input values $x_1, x_2, \ldots$ we get convergence after the smallest common multiple (scm) of the individual step counts.
Since we consider convergence for ALL input in a finite number of steps, the scm over an INFINITE set of inputs must still be FINITE; hence I can use a single $n$ for all input $x$.
Hence the justification.
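The step-count argument can be sketched as follows (the step counts are hypothetical, my assumption for illustration; the argument also tacitly assumes that an orbit which has converged after $n$ steps is still converged after any multiple of $n$ steps):

```python
from math import lcm

# hypothetical per-input step counts (not from the post)
n, m, k = 4, 6, 10

# a single finite bound after which all three inputs have converged
common = lcm(n, m, k)
print(common)  # 60
```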
Notice this also extends to complex numbers.
And, I think, to finite groups.