Consider the problem minimize $f(x_1,x_2) = (x_2 −x_1^2)(x_2 −2x_1^2)$


(i) Show that the first- and second-order necessary conditions for optimality are satisfied at $(0,0)^T$.

(ii) Show that the origin is a local minimizer of f along any line passing through the origin (that is, $x_2 = mx_1$).

(iii) Show that the origin is not a local minimizer of f (consider, for example, curves of the form $x_2 = kx_1^2$)

Okay, so for a point to satisfy both the first- and second-order necessary conditions for optimality, it must satisfy:

I: the gradient of the function at the point is $0$ (stationary point);

II: the Hessian matrix of the function at the point is positive semidefinite.

The sufficient conditions for optimality are: I: the gradient of the function at the point is $0$; II: the Hessian matrix of the function at the point is positive definite. A point that meets them is a strict local minimizer.

Therefore, for part (i):

I computed the gradient $\nabla f = \pmatrix{-6x_1 x_2 + 8x_1^3\\ 2x_2 - 3x_1^2}$, which is equal to zero at the point $(x_1,x_2)=(0,0)$.

And the Hessian matrix: $\pmatrix{-6x_2+24x_1^2 & -6x_1\\ -6x_1 & 2}$, which at the origin has eigenvalues $0, 2 \ge 0$ and is therefore positive semidefinite.

Therefore, both necessary conditions are met. Part (i) is complete.
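As a quick sanity check for part (i), the gradient and Hessian at the origin can be verified symbolically (a sketch using sympy; the variable names are my own, not part of the exercise):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = (x2 - x1**2) * (x2 - 2*x1**2)

# Gradient and Hessian of f, then evaluated at the origin
grad = [sp.diff(f, v) for v in (x1, x2)]
H = sp.hessian(f, (x1, x2))

grad0 = [g.subs({x1: 0, x2: 0}) for g in grad]
H0 = H.subs({x1: 0, x2: 0})
print(grad0)           # [0, 0]  -> stationary point
print(H0.eigenvals())  # {0: 1, 2: 1} -> positive semidefinite
```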

For part (ii): I have to prove that the origin is a local minimizer, so I first try to verify the sufficient conditions, to see whether it can then be stated to be a strict local minimizer.

1) Gradient along the line (substituting $x_2 = m x_1$ and using the chain rule): $\pmatrix{2x_2 m-3(2x_1 x_2+x_1^2 m)+8x_1^3\\ 2x_2-3x_1^2}$, which equals $0$ at the origin (condition I met).

2) Hessian matrix: $\pmatrix{2m^2-3(2x_2+4x_1 m)+24x_1^2 & 2m-6x_1\\ 2m-6x_1 & 2}$, which at the origin becomes $\pmatrix{2m^2 & 2m\\ 2m & 2}$, with eigenvalues $0$ and $2m^2+2$. These are $\ge 0$, so the matrix is PSD (not PD), and we therefore can't conclude that the origin meets the sufficient conditions (and can't say it is a strict local minimizer this way).

What other way can I work through in order to prove the origin as a local minimizer?

For part (iii) I only have to prove the necessary conditions wrong, which is easy, so there are no doubts about that part.

Thanks for the help and time spent reading over this. Regards!

Best answer:

For part (ii), showing that the origin is a local minimizer of $f$ along any line $x_2 = mx_1$ is basically a one-variable minimization problem, so there is no need to compute a Hessian matrix. You need to show that $g_m(x_1) = f(x_1,mx_1) = (mx_1-x_1^2)(mx_1-2x_1^2)$ has a local minimum at $x_1 = 0$.
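The one-variable restriction can be checked symbolically (a sketch with sympy; symbol names are my own): $g_m'(0)=0$ and $g_m''(0)=2m^2>0$ for $m\neq 0$, while for $m=0$ the restriction is $2x_1^4$, which also has a strict minimum at $0$.

```python
import sympy as sp

x, m = sp.symbols('x m', real=True)

# f restricted to the line x2 = m*x1
g = sp.expand((m*x - x**2) * (m*x - 2*x**2))  # m**2*x**2 - 3*m*x**3 + 2*x**4

print(sp.diff(g, x).subs(x, 0))     # 0       -> critical point
print(sp.diff(g, x, 2).subs(x, 0))  # 2*m**2  -> positive whenever m != 0
print(g.subs(m, 0))                 # 2*x**4  -> m = 0 case, minimum at 0
```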

For part (iii), you already showed that the origin does satisfy the necessary conditions for being a minimizer. However, you need to show that despite satisfying the necessary conditions, the origin is NOT a local minimizer of $f$. Hint: for any $t \neq 0$ we have $f(t,\frac{3}{2}t^2) = -\frac{1}{4}t^4 < 0 = f(0,0)$. Based on this, does $f$ attain a local minimum at $(0,0)$?
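The hint's computation can be verified directly (a sympy sketch): substituting the curve $x_2 = \frac{3}{2}x_1^2$ into $f$ gives a value that is strictly negative for every $t \neq 0$.

```python
import sympy as sp

t = sp.symbols('t', real=True)
x1, x2 = sp.symbols('x1 x2')
f = (x2 - x1**2) * (x2 - 2*x1**2)

# Evaluate f along the curve x2 = (3/2) * x1**2
on_curve = sp.simplify(f.subs({x1: t, x2: sp.Rational(3, 2) * t**2}))
print(on_curve)  # -t**4/4, which is < 0 for every t != 0
```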