For what $a\neq0$ does the sequence $a_{n+1}=1/a_{n}+a_{n}/2$ converge?


The problem is stated as:

Let $(a_n)_{n=1}^{\infty}$ with $a_1 = a \neq 0$ and $a_{n+1}=1/a_{n}+a_{n}/2$. For which $a\neq0$ does the sequence converge?

My attempt

I'm thinking about using the fixed-point theorem. So we have to show two things:

  • $f: I \to I$ for some closed interval $I = [a,b]$
  • There exists $C < 1$ such that $|f(x)-f(y)| \leq C|x-y|$ for all $x,y \in I$

Let $f(x) := 1/x + x/2$, differentiating we get:

$f'(x) = -1/x^2 + 1/2 = \frac{x^2-2}{2x^2}$ which is monotonically decreasing on the interval $(-\sqrt2,0) \cup(0, \sqrt2)$ and monotincally increasing on the interval $(-\infty,-\sqrt2)\cup (\sqrt2, \infty)$. They might be potentially good intervals to choose later.

To show that there $\exists C < 1 : |f(x)-f(y)| < C|x-y|$, we apply the mean value theorem, and find that our $C = 1/2$ from our derivative, which of course is strictly less than 1.

Before we choose our interval $I$, let's try to solve $f(x) = x$. We get:

$x = 1/x + x/2$, which gives the solutions $x=\pm \sqrt2$. Of course, the fixed point is unique within the chosen interval $I$, and the fixed-point theorem states that the convergence is independent of the starting point $a_1$ within $I$. However, I don't really know how to move on from this. I'm thinking that we can choose $a \in [-\sqrt2,0) \cup (0, \sqrt2]$ as our only possibility, since that's the closest to a closed interval.
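As a numerical sanity check (not a proof; the starting values below are arbitrary), here is a short Python sketch iterating the recurrence; the results are consistent with convergence to $\operatorname{sign}(a)\sqrt2$ for every $a \neq 0$:

```python
import math

def f(x):
    """One step of the recurrence: a_{n+1} = 1/a_n + a_n/2."""
    return 1 / x + x / 2

def iterate(a, n=60):
    """Apply f to the starting value a for n steps."""
    x = a
    for _ in range(n):
        x = f(x)
    return x

# Empirically, every nonzero start seems to end up at sign(a) * sqrt(2).
for a in (0.01, 1.0, 5.0, -0.5, -100.0):
    print(f"a = {a}: limit ≈ {iterate(a)}")
```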

Thank you.


It is enough to consider the problem for $a>0$: since $f(-x) = -f(x)$, if two sequences satisfy the above recurrence with $a_{0} = a$ and $b_{0} = -a$, then $b_{n} = -a_{n}$ for all $n$.

From

$$(x-\sqrt{2})^2 \geq 0$$

we have

$$\frac{1}{x}+\frac{x}{2} \geq \sqrt{2}\text{ for }x > 0$$
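Spelling out that step (dividing by $2x$ is valid since $x > 0$):

$$(x-\sqrt{2})^2 \geq 0 \;\Longrightarrow\; x^2 + 2 \geq 2\sqrt{2}\,x \;\Longrightarrow\; \frac{x}{2} + \frac{1}{x} \geq \sqrt{2}.$$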

Thus $a_{1},a_{2},\ldots$ are all at least $\sqrt{2}$. Furthermore, since $x \geq \sqrt{2}$ implies $1/x \leq x/2$,

$$\frac{1}{x}+\frac{x}{2} \leq \frac{x}{2}+\frac{x}{2} = x \text{ for }x \geq \sqrt{2}$$

Hence we have

$$a_{1} \geq a_{2} \geq \cdots \geq \sqrt{2}$$

Thus $a_{1},a_{2},\ldots$ is a monotonically decreasing sequence bounded below by $\sqrt{2}$, so by the monotone convergence theorem it converges to some value $c \geq \sqrt{2}$. Letting $n \to \infty$ in the recurrence (the map $x \mapsto \frac{1}{x}+\frac{x}{2}$ is continuous on $[\sqrt{2},\infty)$), $c$ must satisfy $c = \frac{1}{c}+\frac{c}{2}$; thus $c \in \{-\sqrt{2},\sqrt{2}\}$, and since $c \geq \sqrt{2}$ we conclude $c = \sqrt{2}$.
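As a small numerical illustration of this monotonicity argument (the starting value $a_0 = 5$ and the step count are arbitrary choices here; the tolerances only absorb floating-point rounding):

```python
import math

def f(x):
    """One step of the recurrence: a_{n+1} = 1/a_n + a_n/2."""
    return 1 / x + x / 2

# Generate a_1, ..., a_20 from a_0 = 5 and check the two claims numerically.
seq = [5.0]
for _ in range(20):
    seq.append(f(seq[-1]))
tail = seq[1:]  # a_1, a_2, ...

# All terms from a_1 on are at least sqrt(2), and they decrease.
assert all(t >= math.sqrt(2) - 1e-12 for t in tail)
assert all(tail[i] >= tail[i + 1] - 1e-12 for i in range(len(tail) - 1))
print(tail[-1])  # close to sqrt(2) ≈ 1.41421356...
```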

Hence we have the following:

If $a > 0$, the sequence converges to $\sqrt{2}$.

If $a < 0$, the sequence converges to $-\sqrt{2}$.