In my assignment I have the following question,
True or false:
Let $a_{n}$ be a sequence. If $$\lim\limits_{n\to\infty} (a_{2n}-a_{n})=0,$$ then $a_{n}$ is convergent.
The statement is false, and the following example, which I don't understand, proves it:
$a_n = \begin{cases} 1 & n=2^k, k\in \Bbb N \\ 0 & n\ne2^k \end{cases}$
However, I don't understand why.
My main question is: What does the subtraction $(a_{2^k}-a_n)$ do?
In my understanding, the subsequence $a_{2^k}$ takes all the indexes that are powers of $2$, meaning $2,4,8,\dots$, where the value is $1$. However, in the sequence $a_n$ we have many values, such as $0,1,0,1,0,0,0,1,\dots$
So if I make a subtraction I get the following:
$$(a_{2^k}-a_n)=(1-0,\,1-1,\,1-0,\,1-1,\dots)$$ since in my understanding the subsequence always has the value $1$.
Where did I get it wrong? Clearly, I am getting it VERY wrong.
Your help is appreciated.
Alan
The key is that $2n$ is a power of $2$ exactly when $n$ is a power of $2$: if $n=2^k$ then $2n=2^{k+1}$, and if $n$ has an odd factor greater than $1$, so does $2n$. So the subtraction $a_{2n}-a_n$ is always $0$: both terms are $1$ when $n$ is a power of $2$, and both are $0$ otherwise. But $a_n$ on its own keeps flipping between $0$ and $1$ (it equals $1$ at every $n=2^k$ and $0$ in between), so it never converges. Note also that $a_{2n}$ denotes the subsequence $a_2, a_4, a_6, \dots$ of even-indexed terms, not the subsequence $a_{2^k}$.
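If it helps, here is a quick numerical sanity check of the counterexample (a sketch in Python; the helper `a(n)` is just my encoding of the sequence's definition):

```python
def a(n):
    # a_n = 1 if n is a power of 2, else 0.
    # A positive integer n is a power of 2 iff it has exactly
    # one set bit, i.e. n & (n - 1) == 0.
    return 1 if n & (n - 1) == 0 else 0

# a_{2n} - a_n is 0 for every n: doubling just shifts the single
# set bit, so 2n is a power of 2 exactly when n is.
assert all(a(2 * n) - a(n) == 0 for n in range(1, 10_000))

# Yet a_n itself does not converge: it is 1 infinitely often
# (at n = 2^k) and 0 infinitely often (everywhere else).
print([a(n) for n in range(1, 20)])
# [1, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0]
```

Of course the code only checks finitely many $n$; the actual proof is the parity-of-the-power argument above.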