Gradient descent in optimization


Let $f(x) = \frac{1}{2} x^T Qx - b^Tx$, where $b\in\mathbb R^n$ and $Q\in \mathbb R^{n\times n}$ is a symmetric positive-definite matrix.

I need to show that the gradient of $f$ is Lipschitz continuous, i.e. $\|\nabla f(y) - \nabla f(x)\| \le L\|y-x\|$ for all $x, y \in \mathbb R^n$, where the constant $L$ equals the largest eigenvalue of $Q$. I'm told to use the inequality $f(x+y)-f(x) \le \nabla f(x)^T y + \frac{1}{2} L\|y\|^2$.
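Since $\nabla f(x) = Qx - b$, the claimed bound $\|\nabla f(y)-\nabla f(x)\| = \|Q(y-x)\| \le \lambda_{\max}(Q)\,\|y-x\|$ can be sanity-checked numerically. A minimal sketch with NumPy, where the specific $Q$, $b$, and $n$ are random stand-ins (not from the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random symmetric positive-definite Q and vector b (illustrative choices).
A = rng.standard_normal((n, n))
Q = A @ A.T + n * np.eye(n)
b = rng.standard_normal(n)

grad = lambda x: Q @ x - b     # gradient of f(x) = 1/2 x^T Q x - b^T x
L = np.linalg.eigvalsh(Q)[-1]  # largest eigenvalue of Q

# Check the Lipschitz bound on many random pairs (x, y).
for _ in range(1000):
    x, y = rng.standard_normal(n), rng.standard_normal(n)
    assert np.linalg.norm(grad(y) - grad(x)) <= L * np.linalg.norm(y - x) + 1e-9
print("Lipschitz bound holds on all sampled pairs")
```

The check passes because $\|Qv\| \le \lambda_{\max}(Q)\|v\|$ holds for every $v$ when $Q$ is symmetric; equality is attained along the leading eigenvector.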

Moreover, for the scaled steepest-descent method $x_{k+1} = x_k - sD\nabla f(x_k)$, where $s>0$ and $D$ is a symmetric positive-definite matrix, I need to prove that the method converges to the global minimizer of $f$ from any starting point whenever $s\in(0,\frac{2}{K})$, where $K$ is the largest eigenvalue of $D^{\frac12} Q D^{\frac12}$ and $D^{\frac12}$ is the symmetric square root of $D$: $$D=D^{1\over 2}D^{1\over 2}.$$
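For the quadratic $f$, the iteration is affine, $x_{k+1} = (I - sDQ)x_k + sDb$, and $DQ$ is similar to the symmetric matrix $D^{\frac12}QD^{\frac12}$, which is what brings $K$ into the step-size condition. A hedged numerical sketch (random $Q$, $D$, $b$ as stand-ins, step $s = 1/K$ inside the allowed interval):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Illustrative symmetric positive-definite Q and D, and a vector b.
A = rng.standard_normal((n, n))
Q = A @ A.T + n * np.eye(n)
B = rng.standard_normal((n, n))
D = B @ B.T + n * np.eye(n)
b = rng.standard_normal(n)

# Symmetric square root D^{1/2} from the eigendecomposition of D.
w, V = np.linalg.eigh(D)
D_half = V @ np.diag(np.sqrt(w)) @ V.T

# K = largest eigenvalue of D^{1/2} Q D^{1/2}; pick s inside (0, 2/K).
K = np.linalg.eigvalsh(D_half @ Q @ D_half)[-1]
s = 1.0 / K

x_star = np.linalg.solve(Q, b)   # unique global minimizer: Q x = b
x = rng.standard_normal(n)
for _ in range(500):
    x = x - s * D @ (Q @ x - b)  # x_{k+1} = x_k - s D grad f(x_k)

print(np.linalg.norm(x - x_star))  # distance to the minimizer shrinks
```

With $s \in (0, 2/K)$ the eigenvalues of $I - sDQ$ lie strictly inside $(-1, 1)$, so the iterates contract toward $x^\* = Q^{-1}b$ regardless of the starting point.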