Find min of $f (x) = x_1^2 + 4x_2^2 -4x_1 -8x_2$ and prove starting point $(0,0)$ won't converge to it in finite steps


Find the minimum and prove that the starting point $(0,0)$ won't converge to it in a finite number of steps in the algorithm of gradient descent with exact linear search. Is there a starting point that will converge in a finite number of steps?

$$f (x) = x_1^2 + 4x_2^2 -4x_1 -8x_2$$

$$\frac{\partial f}{\partial x_1} = 2x_1-4=0\implies x_1=2$$

$$\frac{\partial f}{\partial x_2} = 8x_2-8=0\implies x_2=1$$

$$\frac{\partial^2 f}{\partial x_1^2} =2$$

$$\frac{\partial^2 f}{\partial x_2^2} =8$$

$$\frac{\partial^2 f}{\partial x_2\,\partial x_1} =0$$

The Hessian is positive definite (its eigenvalues are $2$ and $8$), so $f$ is strictly convex and $(2,1)$ is the unique global minimizer.
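A quick numerical sanity check of the stationary point, using only the formulas derived above:

```python
# f and its gradient, exactly as computed in the derivation above
def f(x1, x2):
    return x1**2 + 4*x2**2 - 4*x1 - 8*x2

def grad(x1, x2):
    return (2*x1 - 4, 8*x2 - 8)

print(grad(2, 1))   # (0, 0): the gradient vanishes at (2, 1)
print(f(2, 1))      # -8: the minimum value
print(f(2.1, 1.1))  # slightly larger than -8, as expected near a minimum
```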

Now I need to prove that the starting point $(0,0)$ won't converge to $(2,1)$ in a finite number of steps if we use exact line search with direction $-\nabla f$. Here's what the iteration looks like:

$$x^{k+1} = x^k - \lambda_k \nabla f(x^k), \qquad \left(x_1^{k+1},\, x_2^{k+1}\right) = \left(x_1^{k},\, x_2^{k}\right) - \lambda_k \nabla f\!\left(x_1^{k},\, x_2^{k}\right)$$
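To make the recursion concrete, here is the first iteration from $(0,0)$, using the standard closed-form exact step for a quadratic, $\lambda_k = \frac{\nabla f^{\top}\nabla f}{\nabla f^{\top} H \nabla f}$ with $H$ the Hessian (this formula is an assumption not stated in the post, but it is what exact line search yields for a quadratic):

$$\nabla f(0,0) = (-4,\,-8),\qquad \lambda_0 = \frac{(-4)^2 + (-8)^2}{2\cdot(-4)^2 + 8\cdot(-8)^2} = \frac{80}{544} = \frac{5}{34},\qquad x^1 = (0,0) + \tfrac{5}{34}(4,8) = \left(\tfrac{10}{17},\,\tfrac{20}{17}\right).$$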

I thought of substituting this monster into $f(x)$, but the expressions get huge; I don't think this is the way to go.
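Instead of expanding symbolically, one can run the iteration in exact rational arithmetic and inspect the iterates directly. This is a minimal sketch, assuming the closed-form exact step length for a quadratic, $\lambda = \frac{g^{\top}g}{g^{\top}Hg}$ with $H = \operatorname{diag}(2,8)$, which is not stated in the post but is what exact line search gives here:

```python
from fractions import Fraction as F

# Gradient descent with exact line search on
# f(x) = x1^2 + 4 x2^2 - 4 x1 - 8 x2, whose Hessian is H = diag(2, 8).
# Exact rationals show the iterates precisely, with no rounding.

def grad(x):
    return (2*x[0] - 4, 8*x[1] - 8)

def step(x):
    g = grad(x)
    # Closed-form exact step length for a quadratic: g.g / g.H.g
    lam = F(g[0]**2 + g[1]**2, 2*g[0]**2 + 8*g[1]**2)
    return (x[0] - lam*g[0], x[1] - lam*g[1])

x = (F(0), F(0))
for k in range(4):
    x = step(x)
    print(x)  # first iterates: 10/17, 20/17 then 25/17, 25/34, ...

# By contrast, starting at (0, 1) the gradient points along an
# eigenvector of H, and one exact step lands on the minimizer:
y = step((F(0), F(1)))
print(y)      # (2, 1)
```

The printed iterates zigzag toward $(2,1)$ but, as exact fractions, neither coordinate ever equals its optimal value, which is the pattern the finite-termination argument has to capture.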