When minimizing $f(x)=\frac{1}{2}x^THx$ (with $H\succ 0$ symmetric positive definite), steepest descent with exact line search yields the iteration $$x_{k+1}=x_k-\frac{x_k^TH^2 x_k}{x_k^TH^3 x_k}Hx_k.$$
The question is to prove the following bound: $$\frac{\|x_{k+1}\|_2}{\|x_{k}\|_2}\leq\frac{\kappa-1}{\kappa+\frac{1}{\sqrt{2\kappa}}},$$ where $\kappa=\kappa_2(H)=\frac{\lambda_{\max}(H)}{\lambda_{\min}(H)}$ is the 2-norm condition number of $H$.
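Before attacking a proof, the conjectured bound can at least be sanity-checked numerically. This is only a sketch: the matrix size, random seed, and iteration count below are arbitrary choices of mine, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
H = M @ M.T + n * np.eye(n)          # a random SPD matrix (my own choice)

lam = np.linalg.eigvalsh(H)          # eigenvalues in ascending order
kappa = lam[-1] / lam[0]             # 2-norm condition number
bound = (kappa - 1) / (kappa + 1 / np.sqrt(2 * kappa))

x = rng.standard_normal(n)
ratios = []
for _ in range(50):
    # exact line-search step length from the iteration above
    step = (x @ H @ H @ x) / (x @ H @ H @ H @ x)
    x_new = x - step * (H @ x)
    ratios.append(np.linalg.norm(x_new) / np.linalg.norm(x))
    x = x_new

max_ratio = max(ratios)              # should stay below the claimed bound
```

On runs like this the observed ratios stay below the claimed bound, which of course proves nothing but is reassuring.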
So far, I have squared the left-hand side and obtained
$$\frac{\|x_{k+1}\|_2^2}{\|x_{k}\|_2^2}=1-2\frac{(x_k^TH x_k)(x_k^TH^2 x_k)}{(x_k^Tx_k)(x_k^TH^3 x_k)}+\frac{(x_k^TH^2 x_k)^3}{(x_k^TH^3 x_k)^2(x_k^T x_k)}$$
This expression is homogeneous in both $x$ and $H$, so it hopefully attains a maximum, but I have no idea how to proceed.
The left-hand side can be rearranged into $$1-\frac{(x_k^TH x_k)^2}{(x_k^TH^2 x_k)(x_k^T x_k)}+\frac{x_k^TH^2 x_k}{x_k^T x_k}\left(\frac{x_k^TH x_k}{x_k^TH^2 x_k}-\frac{x_k^TH^2 x_k}{x_k^TH^3 x_k}\right)^2\leq \left(\frac{\kappa-1}{\kappa+1}\right)^2+\frac{4\kappa}{(\kappa+1)^2}\left(\frac{\kappa-1}{\kappa+1}\right)^4,$$ bounding the two terms separately.
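A quick way to gain confidence in this rearrangement is to check the algebraic identity numerically on random data; the matrix and vector below are arbitrary test inputs of mine.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))
H = M @ M.T + n * np.eye(n)     # random SPD test matrix
x = rng.standard_normal(n)

A = x @ H @ x                   # x^T H x
B = x @ H @ H @ x               # x^T H^2 x
C = x @ x                       # x^T x
D = x @ H @ H @ H @ x           # x^T H^3 x

# squared norm ratio, as expanded earlier
expanded = 1 - 2 * A * B / (C * D) + B**3 / (D**2 * C)
# the rearranged form
rearranged = 1 - A**2 / (B * C) + (B / C) * (A / B - B / D) ** 2

assert np.isclose(expanded, rearranged)
```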
There is a well-known estimate for the convergence rate of the function values (note $f(x_k)>0$ since $H\succ 0$): $$\frac{f(x_{k+1})}{f(x_k)}=1-\frac{(x_k^TH^2 x_k)^2}{(x_k^TH^3 x_k)(x_k^TH x_k)}\leq \left(\frac{\kappa-1}{\kappa+1}\right)^2.$$ This result is typically proved via the Kantorovich inequality or the convexity of the left-hand side, but neither method seems to apply directly to the 2-norm estimate.
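For reference, this classical rate is easy to verify numerically as well. The random SPD matrix is again my own test input, and the step length is written as $(g^Tg)/(g^THg)$ with $g=Hx_k$, which equals the expression in the iteration above.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
M = rng.standard_normal((n, n))
H = M @ M.T + n * np.eye(n)               # random SPD test matrix
lam = np.linalg.eigvalsh(H)
kappa = lam[-1] / lam[0]
rate = ((kappa - 1) / (kappa + 1)) ** 2   # Kantorovich rate for f(x_k)

f = lambda v: 0.5 * v @ H @ v
x = rng.standard_normal(n)
f_ratios = []
for _ in range(30):
    g = H @ x                             # gradient of f
    x_new = x - (g @ g) / (g @ H @ g) * g # exact line-search step
    f_ratios.append(f(x_new) / f(x))
    x = x_new
# every ratio f(x_{k+1})/f(x_k) should respect the Kantorovich rate
```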