Is there a closed form formula for the recursive sequence: $x_n = x_{n-1} + \alpha\sqrt{x_{n-1}}$


I saw this link about deriving a closed-form formula for a recursive sequence: (How to derive a closed form of a simple recursion?)

However, what if my formula is: $$x_n = x_{n-1} + \alpha\sqrt{x_{n-1}} \quad \text{ and } \quad x_0 = \beta \quad \text{ and }\quad\alpha,\beta> 0$$

Is there a closed form solution to determine the value of $x_n$ for a given $n$?

Bonus: If there is a closed-form solution, does it have an inverse? That is, given a value $y$, can one deduce the $n$ for which $x_n$ is closest to $y$?

I couldn't figure out the mathematical formula for either question. Since we're CS engineers, we know that we could create a lookup table for the first 1 million values of $n$, and that should work for us. (It's arguably a reasonable solution, since we need to compute $x_n$ for 100 million values of $n$, so $n$ will repeat thousands of times and precomputing is not such a bad idea.) But still, it would be cleaner if there were a closed-form analytical solution to the above recursive sequence that we could use.
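As a side note, the lookup-table idea is easy to sketch. The table size, the parameter values, and the `nearest_n` helper for the bonus question below are illustrative choices, not part of the question:

```python
import bisect
import math

def build_table(alpha, beta, size):
    """Precompute x_0 .. x_{size-1} of x_n = x_{n-1} + alpha*sqrt(x_{n-1})."""
    xs = [beta]
    for _ in range(size - 1):
        xs.append(xs[-1] + alpha * math.sqrt(xs[-1]))
    return xs

def nearest_n(xs, y):
    """Return the n whose x_n is closest to y (xs is strictly increasing)."""
    i = bisect.bisect_left(xs, y)
    if i == 0:
        return 0
    if i == len(xs):
        return len(xs) - 1
    # Compare the two neighbors of the insertion point.
    return i if xs[i] - y < y - xs[i - 1] else i - 1

table = build_table(alpha=1.0, beta=1.0, size=1000)
print(nearest_n(table, table[500]))  # querying an exact table value
```

Since the sequence is strictly increasing for $\alpha,\beta>0$, the inverse lookup is a binary search rather than a scan.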

There are 4 answers below.

Best answer:

The bonus question turns out to be easier to solve than the actual problem. Note the problem can be rewritten as

$$x_{n+1}-x_n=\alpha\sqrt{x_n}$$

Consider a variation on this:

$$\frac{dy}{dt}=\underbrace{\lim_{h\to0}{y(t+h)-y(t)\over h}}_{\approx~y(t+1)-y(t)}=\alpha\sqrt{y(t)}$$

Separating variables ($dy/\sqrt y=\alpha\,dt$, so $2\sqrt{y(t)}=\alpha t+2\sqrt{y(0)}$) gives

$$y(t)=\left(\frac{\alpha t}2+\sqrt{y(0)}\right)^2$$

Thus, the solution to your problem is approximately given by

$$x_n\approx\left(\frac{\alpha n}2+\sqrt\beta\right)^2$$

Note that:

$$y(t+1)-y(t)=\alpha\left(\frac{\alpha t}2+\sqrt{y(0)}\right)+\frac{\alpha^2}4$$

$$\alpha\sqrt{y(t)}=\frac{\alpha^2t}2+\alpha\sqrt{y(0)}$$

That is,

$$y(t+1)-y(t)=\alpha\sqrt{y(t)}+\frac{\alpha^2}4$$

So the discrete difference overshoots $\alpha\sqrt{y(t)}$ by only the constant $\frac{\alpha^2}4$ per step, and the approximation is quite close.

Thus, given a target value $y$, the approximate inverse is

$$n\approx\frac2\alpha\left(\sqrt y-\sqrt\beta\right)$$
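A quick numerical sanity check of both the approximation and its inverse; the values of $\alpha$, $\beta$, and $n$ below are arbitrary:

```python
import math

alpha, beta, n = 0.5, 2.0, 10_000

# Exact value of x_n by iterating the recursion.
x = beta
for _ in range(n):
    x += alpha * math.sqrt(x)

# Closed-form approximation x_n ~ (alpha*n/2 + sqrt(beta))^2.
approx = (alpha * n / 2 + math.sqrt(beta)) ** 2

# Approximate inverse: recover n from the exact x_n.
n_est = (2 / alpha) * (math.sqrt(x) - math.sqrt(beta))

print(approx / x, n_est)
```

Consistent with the $\frac{\alpha^2}4$-per-step overshoot noted above, the approximation runs slightly ahead of the true sequence, and the relative error shrinks as $n$ grows.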

Answer:

Here is a proof that $0\leq x_n \leq c (n+1)^2$ for $c = \max[\frac{\alpha^2}{4}, x_0]$, assuming $x_0>0$.

Proof (induction): Suppose $0\leq x_n \leq c(n+1)^2$ is true for some nonnegative integer $n$ (it clearly holds for $n=0$). We prove it also holds for $n+1$. Clearly $x_{n+1}\geq 0$. Then, we have \begin{align} x_{n+1} &= x_n + \alpha \sqrt{x_n} \\ &\leq c(n+1)^2 + \alpha c^{1/2}(n+1) \quad \mbox{[since it holds for $n$]}\\ &\leq (c^{1/2}(n+1) + \frac{\alpha}{2})^2 \\ &=(c^{1/2}n + c^{1/2} + \frac{\alpha}{2})^2 \\ &\overset{(a)}{\leq}(c^{1/2}n + 2c^{1/2})^2\\ &=c(n+2)^2 \end{align} where (a) holds because $c^{1/2} + \frac{\alpha}{2} \leq 2c^{1/2}$ (since $c$ was chosen to ensure $c \geq \frac{\alpha^2}{4}$). $\Box$
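The bound is also easy to confirm numerically. This is only a spot check over a few arbitrary $(\alpha, x_0)$ pairs, not a substitute for the proof:

```python
import math

def check_bound(alpha, x0, N=2000):
    """Numerically verify x_n <= c*(n+1)^2 with c = max(alpha^2/4, x0)."""
    c = max(alpha ** 2 / 4, x0)
    x = x0
    for n in range(N):
        assert x <= c * (n + 1) ** 2
        x += alpha * math.sqrt(x)
    return True

for a, b in [(0.1, 5.0), (2.0, 0.01), (1.0, 1.0)]:
    assert check_bound(a, b)
```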


EDIT: Here is a bound with a coefficient $\alpha^2/4$ regardless of $x_0$ (along the lines of the discussion in the comments below).

Claim: $x_n \leq g(n+b)^2$ for all $n \in \{0, 1, 2, ...\}$, with $g=\alpha^2/4$ and $b = \frac{2\sqrt{x_0}}{\alpha}$.

Proof (induction): Suppose true for some $n \in \{0, 1, 2...\}$ (it indeed holds true for $n=0$). We prove for $n+1$. We have: \begin{align} x_{n+1} &= x_n + \alpha \sqrt{x_n} \\ &\leq g(n+b)^2 + \alpha g^{1/2}(n+b) \\ &= g(n+b)^2 + 2g(n+b) \quad \mbox{[since $\alpha = 2g^{1/2}$]}\\ &= g(n+1+b)^2 - g\\ &\leq g(n+1+b)^2 \end{align} $\Box$
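The sharper bound checks out numerically as well (the bound is tight at $n=0$, where $g\,b^2 = x_0$ exactly, so a small floating-point tolerance is added):

```python
import math

def check_sharper_bound(alpha, x0, N=2000):
    """Numerically verify x_n <= g*(n+b)^2 with g = alpha^2/4, b = 2*sqrt(x0)/alpha."""
    g = alpha ** 2 / 4
    b = 2 * math.sqrt(x0) / alpha
    x = x0
    for n in range(N):
        assert x <= g * (n + b) ** 2 + 1e-9  # tolerance for float rounding at n=0
        x += alpha * math.sqrt(x)
    return True

for a, x0 in [(0.1, 5.0), (2.0, 0.01), (1.0, 1.0)]:
    assert check_sharper_bound(a, x0)
```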

Answer:

Aha! So this is an answer to a comment that @SimplyBeautifulArt made. It is possible to solve

$$y(t+1) - y(t) = \alpha \sqrt{y(t)}$$

for $t\in\mathbb{R}^+$, and we can take $y(0) = \beta$ for any $\beta \in \mathbb{R}^+$. I just thought it'd be nice to state that such problems are solvable. Sadly it's computationally a nightmare, but it is possible.

Pay close attention.

Start with the equation

$$g(\xi) = \xi + \alpha \sqrt{\xi}$$

The trouble I was having is that this function isn't holomorphic in a neighborhood of $0$, so I was getting divergent answers. But if we invert it, we do get a function that is.

Solving for $\xi$ in $g(\xi) = \xi + \alpha \sqrt{\xi}$, gives

$$\xi = \frac{\alpha^2}{2} - \frac{\alpha\sqrt{\alpha^2+4g(\xi)}}{2} + g(\xi)$$

Let $f(x) = \frac{\alpha^2}{2} - \frac{\alpha\sqrt{\alpha^2+4x}}{2} + x$, so that $f(0) = 0$ and $f'(0) = 0$. Luckily, $f$ is now holomorphic in a neighborhood of $0$ so long as $\alpha > 0$. It also follows that

$$f(g(x)) = x$$

Because of this, our function can be conjugated to $x^n$; determining $n$ is equivalent to determining the order of the zero of $f$ at $0$ (here $n = 2$, since $f(x) = x^2/\alpha^2 + O(x^3)$). That is,

$$f(x) = h^{-1}(h(x)^n)$$

for $|x| < \delta$ small and a nice unique $h$. Let $$f^{\circ t}(x) = h^{-1}(h(x)^{n^t})$$

such that $f(f^{\circ t}(x)) = f^{\circ t+1}(x)$. And, more importantly,

$$g(f^{\circ t}(x)) = f^{\circ t-1}(x)$$

Let

$$y(t) = f^{\circ -t}(x)$$ which is defined on $\mathbb{R}$ by analytic continuation for $0 < x < \delta$, and

$$g(y(t)) = y(t+1)$$

Here comes the magic

$$g(y(t)) = y(t) + \alpha \sqrt{y(t)} = y(t+1)$$

and

$$y(t+1) - y(t) = \Delta y = \alpha \sqrt{y}$$

Finding $\tilde{y}$ such that $\tilde{y}(0) = \beta$ simply involves shifting the value $t$. Take care to notice that $y$ is analytic as well and tends to $0$ as $t \to -\infty$ and tends to $\infty$ as $t \to \infty$.

That's what I was looking for!

Answer:

Some notes on the asymptotic behavior of the sequence in question.

  • You are iterating the function $f(x) = x + \alpha \sqrt{x}$. For positive $\alpha$, this will obviously produce a growing sequence, and it is not hard to see that it will grow to infinity.
  • If you define $g(x) = \sqrt{x^2 + \alpha x}$, you can verify that $f(x^2) = g(x)^2$.
  • Defining $y_n = \sqrt{x_n}$, you can see that $y_n = g(y_{n-1})$.

Consider the expansion (valid for large $x$)

$$g(x) = \sqrt{x^2 + \alpha x} = x \sqrt{1 + \frac{\alpha }{x}} = x \left( 1 + \frac{\alpha}{2x} - \frac{\alpha^2}{8x^2} + o(x^{-2})\right)$$ $$= x + \frac{\alpha}{2} - \frac{\alpha^2}{8x} + o(x^{-1})$$

We can thus infer that, for large $n$

$$y_n \approx \frac{\alpha n}{2}$$

Feeding this back into the asymptotic expansion, we can show further that:

$$y_n \approx \frac{\alpha}{4}(2n - \log n)$$

Deriving extra terms in this expansion is possible, but not necessarily enlightening.

One can then infer that, for large $n$,

$$x_n \approx \frac{\alpha^2}{16}\left(2n - \log n\right)^2$$

This does not really touch on the dependence on $\beta$, which is harder to capture by studying asymptotics.
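The two-term asymptotic can be checked numerically; $\alpha$, $\beta$, and $n$ below are arbitrary (and, as noted, $\beta$ only affects lower-order terms):

```python
import math

alpha, beta, n = 1.0, 1.0, 100_000

# Exact x_n by iterating the recursion.
x = beta
for _ in range(n):
    x += alpha * math.sqrt(x)

# Two-term asymptotic: x_n ~ (alpha^2/16) * (2n - log n)^2.
asym = (alpha ** 2 / 16) * (2 * n - math.log(n)) ** 2
print(x / asym)
```

The ratio approaches $1$ as $n$ grows; the residual gap is the $\beta$-dependent constant-order correction the answer mentions.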