In an iterative direction method for solving a problem in $\mathbb{R}^n$ (optimization, root finding, etc.), the difference between consecutive iterates is given by $$ x^{k+1} - x^k = t^kd^k. $$ For simplicity, consider the unit step $t^k = 1$, so that the difference is just $d^k \in \mathbb{R}^n$.
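As a quick sanity check of the setup (toy random directions of my own choosing, not any particular method), the unit-step update makes the difference over $p$ steps telescope into a sum of directions:

```python
import numpy as np

# With unit step t^k = 1, the update x^{k+1} = x^k + d^k makes the
# difference x^{k+p} - x^k telescope into the sum of d^k, ..., d^{k+p-1}.
rng = np.random.default_rng(0)
d = rng.standard_normal((10, 3))   # hypothetical directions d^0, ..., d^9 in R^3
x = np.zeros((11, 3))
for k in range(10):
    x[k + 1] = x[k] + d[k]         # unit-step iterate update

k, p = 2, 5
# x^{k+p} - x^k equals the sum of d^i for i = k, ..., k+p-1
assert np.allclose(x[k + p] - x[k], d[k:k + p].sum(axis=0))
```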
If $d^k \rightarrow 0$, then the sequence is convergent.
And in fact, if there is an infinite subset $\mathbb{N}_1 \subset \mathbb{N}$ such that $d^k \stackrel{\mathbb{N}_1}{\rightarrow} 0$, then the iterate sequence $\{x^k\}_{k \in \mathbb{N}}$ has a limit point, say $x^k \stackrel{\mathbb{N}_1}{\rightarrow} \overline{x}$.
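A toy run illustrating this direction (my own example, chosen only so the behavior is easy to verify): steepest descent on $f(x) = \|x\|^2/2$ with direction $d^k = -\tfrac{1}{2}x^k$, so the unit-step update gives $x^{k+1} = \tfrac{1}{2}x^k$, and both $d^k \rightarrow 0$ and $x^k \rightarrow 0$:

```python
import numpy as np

# Toy example: d^k = -0.5 * x^k, unit step. Then x^{k+1} = 0.5 * x^k,
# so the directions vanish and the iterates converge to the limit point 0.
x = np.array([4.0, -3.0])
for _ in range(60):
    d = -0.5 * x          # direction shrinks together with the iterate
    x = x + d             # unit-step update

assert np.linalg.norm(x) < 1e-12   # iterates have converged to 0
```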
I want to know if I can prove some kind of converse: namely, if there is no infinite subset of $\mathbb{N}$ along which $d^k$ converges to zero, can one conclude that the sequence has no limit point? Is that true?
I tried arguing by contradiction. Suppose that $\|d^k\| > \epsilon$ for arbitrarily large $k$, and assume the sequence has a limit point along some infinite subset. If two "consecutive" indices in the convergent subsequence differ by $p$, then by the reverse triangle inequality $$ \|x^{k+p}-x^k\|=\Big\|\sum_{i=k}^{k+p-1}d^i\Big\| \geq \Big| \|d^j\| - \Big\|\sum_{\substack{i=k \\ i \neq j}}^{k+p-1}d^i\Big\| \Big| $$ for every $j \in \{k, \cdots, k+p-1\}$. Can I then somehow show that at least one of the $d^i$ has to be smaller than an arbitrary number, thus reaching a contradiction? I'm trying, but without luck.
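For what it's worth, the displayed bound checks out numerically (random directions and sizes of my own choosing): applying the reverse triangle inequality to $d^j$ and the sum of the remaining terms gives the stated lower bound for every $j$:

```python
import numpy as np

# Numeric check: ||sum_i d^i|| >= | ||d^j|| - ||sum_{i != j} d^i|| |
# for each j, by the reverse triangle inequality applied to the split
# sum_i d^i = d^j + sum_{i != j} d^i.
rng = np.random.default_rng(1)
p = 6
d = rng.standard_normal((p, 4))       # directions d^k, ..., d^{k+p-1}
total = d.sum(axis=0)                 # = x^{k+p} - x^k under unit steps
lhs = np.linalg.norm(total)
for j in range(p):
    rest = total - d[j]               # sum over i != j
    rhs = abs(np.linalg.norm(d[j]) - np.linalg.norm(rest))
    assert lhs >= rhs - 1e-12
```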