Gradient & Newton method for optimization


I have the following function to minimize using both gradient and Newton's method

$f(x_1,x_2)= 36x_1+36x_2+5x_1^2+5x_2^2+8x_1x_2$

Using the initial point $x^0=(0,0)$ for the gradient method, I found that the next point $x^1$ is $(-4,-4)$, and so is the point after that; in general I keep getting the same result and the iterates never change. My question is: is this the optimum? And if there is an explanation for this behavior, can you explain it to me?
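The described behavior can be reproduced numerically (a minimal sketch; the question does not state the step-size rule, so the fixed step $\alpha$ below is an assumption). With $\alpha = 1/9$, gradient descent from the origin cycles between $(0,0)$ and $(-4,-4)$ forever, which matches the observation; a smaller fixed step converges instead:

```python
import numpy as np

def grad_f(x):
    # gradient of f(x1,x2) = 36x1 + 36x2 + 5x1^2 + 5x2^2 + 8x1x2
    return np.array([36 + 10*x[0] + 8*x[1],
                     36 + 10*x[1] + 8*x[0]])

def gradient_descent(alpha, steps):
    x = np.zeros(2)               # start at x^0 = (0, 0)
    for _ in range(steps):
        x = x - alpha * grad_f(x)
    return x

print(gradient_descent(1/9, 10))    # back at (0, 0) after an even number of steps
print(gradient_descent(1/9, 11))    # (-4, -4) after an odd number of steps
print(gradient_descent(0.05, 500))  # converges to (-2, -2)
```

The Hessian of this $f$ has eigenvalues $18$ and $2$, so a fixed-step gradient method is stable only for $\alpha < 2/18 = 1/9$; at exactly $\alpha = 1/9$ the error along the eigenvector $(1,1)$ is multiplied by $-1$ each step, producing the two-cycle.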

The second question is about Newton's method: I ran into some problems applying it. Can anyone help me figure out how to use Newton's method to minimize this function? (I don't want the result, just how to apply the method.)



Best answer

I would say

$x_1 = u+v\\ x_2 = u-v$

$f(u,v) = 5u^2 + 10uv + 5v^2 + 5u^2 - 10uv + 5v^2 + 8u^2 - 8v^2 + 36u + 36v + 36u - 36v\\ = 18u^2 + 2v^2 + 72u\\ = 18(u+2)^2 + 2v^2 - 72$

The minimum is at $(u,v) = (-2,0)$, i.e. $(x_1,x_2) = (-2,-2)$, where $f = -72$.
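As a quick check of the algebra above (a sketch using SymPy):

```python
import sympy as sp

u, v = sp.symbols('u v')
x1, x2 = u + v, u - v  # the substitution x1 = u + v, x2 = u - v

f = 36*x1 + 36*x2 + 5*x1**2 + 5*x2**2 + 8*x1*x2
print(sp.expand(f))                            # 18*u**2 + 72*u + 2*v**2
print(sp.expand(18*(u + 2)**2 + 2*v**2 - 72))  # the same expression
```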

Alternatively you could find $x_1,x_2$ such that:

$\nabla f(x_1,x_2) = \mathbf 0$

Provided that $\frac {\partial^2 f}{\partial x_1^2}\frac {\partial^2 f}{\partial x_2^2} - \left(\frac {\partial^2 f}{\partial x_1\partial x_2}\right)^2>0$
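For this $f$, $\nabla f = \mathbf 0$ is just the linear system $10x_1 + 8x_2 = -36$, $8x_1 + 10x_2 = -36$, which can be solved directly (a minimal sketch using NumPy):

```python
import numpy as np

# For this quadratic, f(x) = b.x + (1/2) x.H.x with
# H = [[10, 8], [8, 10]] (the Hessian) and b = (36, 36),
# so grad f = 0 is the linear system H x = -b.
H = np.array([[10.0, 8.0], [8.0, 10.0]])
b = np.array([36.0, 36.0])

x_star = np.linalg.solve(H, -b)
print(x_star)  # [-2. -2.]
```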

$\nabla f(x_1,x_2)$ always points in the direction of steepest increase of $f(x_1,x_2)$, and $-\nabla f(x_1,x_2)$ in the direction of steepest decrease. If $f(x_1,x_2)$ is at a minimum, there is no direction that reduces it, therefore $\nabla f(x_1,x_2) = \mathbf 0$.

This is a necessary but not a sufficient condition: if $\nabla f(x_1,x_2) = \mathbf 0$, you could be at a maximum, a minimum, or a saddle point. Enter the second-derivative test.

$\frac {\partial^2 f}{\partial x_1^2}\frac {\partial^2 f}{\partial x_2^2} - \left(\frac {\partial^2 f}{\partial x_1\partial x_2}\right)^2<0$ means you have a saddle point.

$\frac {\partial^2 f}{\partial x_1^2}\frac {\partial^2 f}{\partial x_2^2} - \left(\frac {\partial^2 f}{\partial x_1\partial x_2}\right)^2>0$ indicates either a maximum or a minimum.

$\frac {\partial^2 f}{\partial x_1^2}, \frac {\partial^2 f}{\partial x_2^2} < 0$ implies a maximum.

$\frac {\partial^2 f}{\partial x_1^2}, \frac {\partial^2 f}{\partial x_2^2} > 0$ implies a minimum.
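For the $f$ in the question all second derivatives are constants, so the test evaluates to fixed numbers (a minimal check):

```python
# Second-derivative test for f(x1,x2) = 36x1 + 36x2 + 5x1^2 + 5x2^2 + 8x1x2
fxx = 10  # d^2 f / dx1^2
fyy = 10  # d^2 f / dx2^2
fxy = 8   # d^2 f / dx1 dx2

D = fxx * fyy - fxy**2
print(D)        # 36 > 0, so a maximum or a minimum (no saddle)
print(fxx > 0)  # True, and fyy > 0 too, so it is a minimum
```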

As for Newton's method: it is a method for finding zeros, so to use it for minimization you apply it to the gradient, i.e. you solve $\nabla f(x) = \mathbf 0$ with the update $x^{k+1} = x^k - [\nabla^2 f(x^k)]^{-1}\nabla f(x^k)$. Since this $f$ is quadratic, the Hessian is constant and a single Newton step from any starting point lands exactly on the stationary point. Keep in mind that a zero of $\nabla f$ found this way could still be a maximum or a saddle, so the second-derivative test above still applies.
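Concretely, Newton's method applied to $\nabla f = \mathbf 0$ uses the update $x \leftarrow x - H^{-1}\nabla f(x)$; for this quadratic the Hessian $H$ is constant, so one step from $(0,0)$ already reaches the minimizer (a minimal sketch):

```python
import numpy as np

def grad_f(x):
    # gradient of f(x1,x2) = 36x1 + 36x2 + 5x1^2 + 5x2^2 + 8x1x2
    return np.array([36 + 10*x[0] + 8*x[1],
                     36 + 10*x[1] + 8*x[0]])

H = np.array([[10.0, 8.0], [8.0, 10.0]])  # constant Hessian of this quadratic

x = np.zeros(2)                        # start at x^0 = (0, 0)
x = x - np.linalg.solve(H, grad_f(x))  # one Newton step: solve H d = grad, subtract d
print(x)  # [-2. -2.] -- the minimizer, reached in a single step
```

Solving the linear system with `np.linalg.solve` is preferred over forming $H^{-1}$ explicitly; it is the standard way to compute a Newton step.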