Apologies for the basic question.
Let us say that the general convex problem is $$\min_x f(x).$$
Say we employ a subgradient method $$x^{(k)} = x^{(k-1)} - t_k\, g^{(k-1)}, \qquad g^{(k-1)} \in \partial f\left( x^{(k-1)}\right),$$ where $t_k$ is a step size and $g^{(k-1)}$ is any subgradient of $f$ at $x^{(k-1)}$ (an element of the subdifferential $\partial f$).
Question: Can we say $x^{(k)} \approx x^{(k-1)}$ for iteration $k \rightarrow \infty$?
Not unless the step size $t_k$ decays to $0$. Otherwise, you can take $f(x) = |x|$ as a counterexample. Suppose at iteration $k$ you are at $x = 1$. The subgradient of $f$ at $x$ is $1$, so if $t_k = 2$ (for all $k$), the update rule gives $x^{(k+1)} = 1 - (2)(1) = -1$. Now the subgradient is $-1$, and the update rule gives $$x^{(k+2)} = -1 - (2)(-1) = 1.$$ In other words, the iterates keep bouncing between $\pm 1$ and never converge.
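As a quick sanity check, here is a minimal Python sketch of that oscillation (the helper `subgrad_abs` is my own naming, not from the post; it picks $0$ as the subgradient at $x=0$):

```python
def subgrad_abs(x):
    """A subgradient of f(x) = |x|: sign(x), choosing 0 at x = 0."""
    return 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)

x = 1.0
iterates = [x]
for k in range(6):
    x = x - 2.0 * subgrad_abs(x)  # fixed step size t_k = 2
    iterates.append(x)

print(iterates)  # [1.0, -1.0, 1.0, -1.0, 1.0, -1.0, 1.0]
```

The iterates alternate between $1$ and $-1$ forever, exactly as described.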
When $t_k \to 0$ and the subgradients are bounded, successive iterates satisfy $\|x^{(k)} - x^{(k-1)}\| = t_k \|g^{(k-1)}\| \to 0$, so indeed $x^{(k)} \approx x^{(k-1)}$ for large $k$. Note that this alone does not force the subgradient method to reach the minimum of the convex function $f$; for convergence to a minimizer one typically also requires $\sum_k t_k = \infty$ (e.g. $t_k = 1/k$).
Moreover, you usually need the assumption that $f$ is at least Lipschitz (equivalently, that its subgradients are bounded) in order to prove results on how long this convergence takes.
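To illustrate, the same sketch with the diminishing step size $t_k = 1/k$ (a standard choice satisfying $t_k \to 0$ and $\sum_k t_k = \infty$; the starting point $x^{(0)} = 2$ is arbitrary) shows successive iterates differing by exactly $t_k$ while the iterates approach the minimizer $x^\star = 0$:

```python
def subgrad_abs(x):
    """A subgradient of f(x) = |x|: sign(x), choosing 0 at x = 0."""
    return 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)

x = 2.0
for k in range(1, 10001):
    x_prev = x
    x = x - (1.0 / k) * subgrad_abs(x)  # diminishing step size t_k = 1/k

print(abs(x))           # close to 0: iterates approach the minimizer
print(abs(x - x_prev))  # equals t_k * |g|, which tends to 0
```

Once an iterate lands within $t_k$ of the origin, it stays trapped in a shrinking neighborhood of it, which is why both quantities go to zero.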