Decreasing by $\sqrt{n}$ every time


We start with a number $n>1$. At each step, if the current number is $t$, we replace it with $t-\sqrt{t}$. How many steps (asymptotically, in terms of $n$) do we need until the number drops below $1$?

Clearly we will need at least $\sqrt{n}$ steps: if we replaced $t$ by $t-\sqrt{n}$ at every step, we would need exactly $\sqrt{n}$ steps, and since $\sqrt{t}\le\sqrt{n}$ the actual process decreases more slowly. The recurrence is $f(n)=1+f(n-\sqrt{n})$, but I'm not sure how to solve it.
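Empirically, a quick simulation suggests the count grows like a constant multiple of $\sqrt{n}$. This is only a sketch in plain floating point; the helper name `steps_below_one` is introduced here for illustration:

```python
import math

def steps_below_one(n: float) -> int:
    """Apply t -> t - sqrt(t) until t < 1 and count the steps."""
    t, steps = n, 0
    while t >= 1:
        t -= math.sqrt(t)
        steps += 1
    return steps

for n in [10**2, 10**4, 10**6, 10**8]:
    s = steps_below_one(n)
    # The last column appears to settle toward a constant as n grows.
    print(f"n={n:>9}  steps={s:>6}  steps/sqrt(n)={s / math.sqrt(n):.4f}")
```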

Best answer:

To attack this problem, let me first give a good heuristic by passing to a continuous analog, which is easier to solve. We will then modify the heuristic into a formal proof of the asymptotic. First, consider the sequence $a_n$ defined by $$a_{n+1}=a_n-\sqrt{a_n}$$ starting from some fixed $a_0$. We want to know the first $n$ for which $a_n$ drops below $1$. We can rewrite the recurrence as $$\Delta a_n = -\sqrt{a_n},$$ where $\Delta$ is the forward difference operator.
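(Concretely, with $a_0=100$ the first few terms are $a_1=90$, $a_2=90-\sqrt{90}\approx 80.513$, $a_3\approx 71.540$, and so on.)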

I do this to suggest the following easier parallel problem: consider a function $g(t)$ with $g(0)=a_0$ satisfying the differential equation $$g'(t)=-\sqrt{g(t)},$$ where we have merely exchanged the forward difference for a derivative. This is easy to solve: the general solution is of the form $$g(t)=\left(\frac{1}{2}t-c\right)^2$$ for $c\geq 0$ (valid for $0\le t\le 2c$). We need $c^2=a_0$ for $g(0)$ to work out, so the solution is $$g(t)=\left(\frac{1}{2}t-\sqrt{a_0}\right)^2,$$ which makes it clear that $g$ takes time $2\sqrt{a_0}$ to reach $0$ (or, equivalently, $2\sqrt{a_0}-2$ to reach $1$). Notice that $g$ decreases strictly more slowly than your sequence, so we're safe on that front: $$g(t+1)-g(t)=\frac{1}{4}-\sqrt{g(t)},$$ which is close to the desired decline but declines too slowly, so $a_n\le g(n)$ and the sequence reaches $1$ no later than $g$ does. Since you have already proven that it takes at least $\sqrt{a_0}$ steps, and this shows it takes at most about $2\sqrt{a_0}$ steps, we conclude that the number of steps required is $\Theta(\sqrt{a_0})$.
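For completeness, the solution above comes from the standard separation-of-variables computation (a routine step, spelled out here): from $g'(t)=-\sqrt{g(t)}$ we separate and integrate,
$$\int \frac{dg}{\sqrt{g}} = -\int dt \quad\Longrightarrow\quad 2\sqrt{g(t)} = C - t,$$
and imposing $g(0)=a_0$ gives $C=2\sqrt{a_0}$, hence $\sqrt{g(t)}=\sqrt{a_0}-\frac{t}{2}$ and $g(t)=\left(\frac{1}{2}t-\sqrt{a_0}\right)^2$ for $0\le t\le 2\sqrt{a_0}$.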

This implicitly assumes you round square roots up or not at all. However, if you instead use a "fudged" version of $g$ defined as $$g(t)=(\alpha t - \sqrt{a_0})^2,$$ you can actually prove that the number of steps taken, divided by $\sqrt{a_0}$, tends to $2$ regardless of rounding (or indeed any bounded disturbance in the sequence). To sketch a proof, first fix a "fudge factor" $k$ bounding the largest change we can encounter due to rounding (e.g. $k=1$ works for floors or ceilings). Next, observe that if $\alpha>\frac{1}{2}$, then $g(t+1)-g(t)$ is less than $-\sqrt{g(t)}-k$ whenever $g(t)$ is large, and if $\alpha<\frac{1}{2}$, then $g(t+1)-g(t)$ is more than $-\sqrt{g(t)}+k$ whenever $g(t)$ is large. Since $g$ reaches $1$ at time $\frac{\sqrt{a_0}-1}{\alpha}$, a pair of such functions for two values of $\alpha$ close to $\frac{1}{2}$ bounds the behavior of the sequence while it remains large. The sequence then takes only finitely many further steps to pass from where it becomes too small for our bounds to apply down to $1$, and that finite contribution vanishes in the limit, since we divide by $\sqrt{a_0}$.
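As a sanity check on that limit (a sketch only; the name `floor_steps` and the test values are introduced here), one can run the floor-rounded recursion $t \mapsto t-\lfloor\sqrt{t}\rfloor$ and watch the ratio of the step count to $\sqrt{a_0}$:

```python
import math

def floor_steps(n: int) -> int:
    """Apply t -> t - floor(sqrt(t)) until t < 1 and count the steps."""
    t, steps = n, 0
    while t >= 1:
        t -= math.isqrt(t)  # exact integer square root, rounded down
        steps += 1
    return steps

for n in [10**4, 10**6, 10**8, 10**10]:
    # The ratio approaches 2, matching the fudged-g argument above.
    print(f"a0={n:>11}  steps/sqrt(a0)={floor_steps(n) / math.sqrt(n):.4f}")
```

Using `math.isqrt` keeps the iteration exact over the integers, so the observed ratio reflects the floor-rounded recursion itself rather than floating-point error.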