I'm looking for a citable reference to fill in a gap in an intermediate step of a proof which requires convergence of a suboptimal version of steepest descent.
The function $f:\mathbb{R}^n\to\mathbb{R}$ I am optimizing is strictly convex and differentiable almost everywhere, and my steepest descent steps $$x_{k+1}=x_{k}-s_k\frac{\nabla f(x_k)}{\|\nabla f(x_k)\|}$$ have step sizes $s_k$ satisfying $s_k\to 0^+$ and $\sum_k s_k=\infty$. The step sizes $s_k$ are pre-specified, so it may occasionally happen that a particular step is counterproductive, i.e. $f(x_{k+1})>f(x_k)$.
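For concreteness, here is a minimal numerical sketch of the scheme (not part of the proof): normalized steepest descent with the pre-specified steps $s_k=1/k$, which satisfy $s_k\to 0^+$ and $\sum_k s_k=\infty$, applied to the strictly convex example $f(x)=\|x\|^2$ with minimizer at the origin. The objective and step schedule are my choices for illustration only.

```python
import numpy as np

# Illustrative objective (my choice): f(x) = ||x||^2, strictly convex,
# minimized at the origin.
def f(x):
    return float(x @ x)

def grad_f(x):
    return 2.0 * x

x = np.array([3.0, -4.0])  # arbitrary starting point x_0
for k in range(1, 200001):
    g = grad_f(x)
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        break  # gradient vanishes only at the minimizer here
    # Pre-specified step s_k = 1/k: s_k -> 0+ and the steps sum to infinity,
    # so the iterates can travel arbitrarily far while overshoot shrinks.
    x = x - (1.0 / k) * g / gnorm

print(np.linalg.norm(x))  # distance to the minimizer; small at this point
```

Once the iterate first comes within $s_k$ of the origin, each subsequent overshoot is bounded by the current step size, so the final distance is on the order of the last step, illustrating why $s_k\to 0^+$ forces convergence even though individual steps may be counterproductive.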
Is there a citable reference asserting that $x_k$ converges to the minimizer for almost every starting point $x_0$? (Since $f$ is only differentiable almost everywhere, the sequence isn't even well-defined for every starting point.)
I don't care about the actual rate of convergence or any of the standard numerical analysis issues, since this is a proof and not a calculation.
This may be a little overkill, but in general what you are looking for is Zangwill's global convergence theorem; see e.g. http://www.math.udel.edu/~angell/gl_conv.pdf