Linear convergence of sequence


I have the following exercise, but it seems to me that the claim is false:

Let $(x_n)_{n \in \mathbb{N}}$ be a sequence of real numbers and $x^* \in \mathbb{R}$. We say that the sequence $(x_n)$ converges to $x^*$ with order $p > 0$ if:

$$\exists C > 0,\ \forall n \in \mathbb{N}: \quad |x_{n+1} - x^*| \leq C\,|x_n - x^*|^p$$

If $p = 1$ we say that the sequence converges linearly. For $p = 1$, prove that $C \in ]0,1[$.

I don't understand why $C$ needs to be in $]0,1[$. For example, take $(x_n) = 0$ (the null sequence); then the inequality above clearly holds for this sequence with, say, $C = 10$.

So what is wrong with my reasoning?

Thank you !

There is 1 best solution below.

To mark the question as answered:

Yes, the claim that $C$ needs to be in $]0,1[$ is wrong as stated.

I think that for $p = 1$ we need $C \in ]0,1[$ to guarantee convergence of $(x_n)$ to $x^*$. Otherwise, for instance, the sequence $x_n = C^n$ with $C \geq 1$ satisfies the definition for $x^* = 0$ (with equality: $|x_{n+1}| = C\,|x_n|$), but does not converge to it.
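The counterexample above can be checked numerically. A minimal sketch (the choice $C = 2$ and the cutoff of 10 terms are mine, just for illustration): for $x_n = C^n$ with $C = 2$ and $x^* = 0$, the inequality $|x_{n+1} - x^*| \leq C\,|x_n - x^*|$ holds at every step, yet the terms grow without bound.

```python
# Sequence x_n = C**n with C = 2; candidate limit x* = 0.
C = 2.0
x_star = 0.0
x = [C**n for n in range(10)]  # first 10 terms: 1, 2, 4, ..., 512

# The defining inequality |x_{n+1} - x*| <= C * |x_n - x*| holds
# for every consecutive pair (here with equality).
holds = all(abs(x[n + 1] - x_star) <= C * abs(x[n] - x_star)
            for n in range(len(x) - 1))

print(holds)  # True: the definition with C = 2 is satisfied...
print(x[-1])  # 512.0: ...but the sequence diverges instead of tending to 0
```

This is exactly why the definition is only meaningful for linear convergence when $C < 1$: then $|x_n - x^*| \leq C^n |x_0 - x^*| \to 0$.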