Convergence of a Recursive Sequence - An Example


Consider the sequence $\displaystyle x(k+1) = \frac{1}{2}\left(x(k) + \frac{a}{x(k)}\right)$ where $x(k)$ stands for the $k$th term of the sequence. What does this process converge to, and what is the order of convergence?

Is this something that should be done by induction or is there a better way to go about this? Thanks!


There are 2 answers below.

BEST ANSWER

Assume the sequence converges to some nonzero limit, $\lim_{k\to \infty}x(k) = b$. Taking the limit on both sides of the recursion gives

$$ b=\frac{1}{2}\left(b+\frac{a}{b}\right). $$

Now solve for $b$: multiplying through by $2b$ yields $b^2 = a$, so $b = \pm\sqrt{a}$ (which requires $a > 0$).
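A quick numerical check supports this. The sketch below (the function name and the choices $a=2$, $x(0)=1$ are illustrative, not from the question) iterates the recursion and compares the result to $\sqrt{a}$:

```python
import math

def babylonian(a, x0, n=10):
    """Iterate x(k+1) = (x(k) + a/x(k)) / 2 starting from x0."""
    x = x0
    for _ in range(n):
        x = 0.5 * (x + a / x)
    return x

# With a = 2 and x0 = 1, the iterates approach sqrt(2).
print(babylonian(2.0, 1.0))  # ≈ 1.4142135623730951
print(math.sqrt(2.0))
```

This is the classical Babylonian method for computing square roots; after only a handful of steps the iterate agrees with `math.sqrt` to machine precision.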


Well, clearly we don't want to end up with any $x(k)=0$, since the recursion would then be undefined. Fortunately, $x(k+1)=0$ if and only if $x(k)=-\frac{a}{x(k)}$, if and only if $x(k)^2=-a$. So if $a>0$, no term can ever vanish. For $a=0$, we just need $x(0)\neq 0$, since then $x(k+1)=x(k)/2$ is never zero. For $a<0$ the situation is more delicate: we must avoid the (countably many) starting values $x(0)$ whose orbit eventually lands on $\pm\sqrt{-a}$, since the very next step would then be $0$.

That done, we have a well-defined sequence. What about potential limits? Well, if the sequence converges in $\Bbb R$ to a non-zero limit, say $x$, then using the recursion formula and taking the limit as $k\to\infty$ on both sides, we have $$x=\frac12\left(x+\frac{a}x\right),$$ whence $x^2=a$. That can't happen at all if $a<0$, so the sequence won't converge in that case. You should be able to show that the sequence can't converge to a limit of $0$ if $a>0$, and that it converges to $0$ if $a=0$.

You've only got two possible limits if $a>0$, namely $\pm\sqrt a$. You should be able to show that it converges to $\sqrt{a}$, at least for a good initial choice of $x(0)$.
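As for the order of convergence the question asks about: this recursion is exactly Newton's method applied to $f(x)=x^2-a$, so near $\sqrt{a}$ the error roughly squares at each step, with $e(k+1) \approx e(k)^2/(2\sqrt{a})$. The sketch below (the choices $a=2$ and $x(0)=1$ are illustrative) prints the error and the ratio $e(k+1)/e(k)^2$, which should settle near $1/(2\sqrt{a})$:

```python
import math

a = 2.0
root = math.sqrt(a)
x = 1.0  # illustrative starting value
prev_err = abs(x - root)
for k in range(5):
    x = 0.5 * (x + a / x)
    err = abs(x - root)
    if err > 0:
        # For quadratic convergence, err / prev_err**2 approaches
        # the constant 1/(2*sqrt(a)).
        print(k, err, err / prev_err**2)
    prev_err = err
```

Running this, the number of correct digits roughly doubles per iteration until floating-point precision is exhausted, which is the hallmark of quadratic (second-order) convergence.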