Minimization problem $f(x_1,x_2)=x_{1}^2+x_2+e^{x_{1}^2+x_{2}^2}$


"Show that the function $f:\mathbb{R}^2\rightarrow \mathbb{R}$ given by $f(x)=x^2_{1}+x_{2}+e^{x^2_{ 1 }+x^2_{2}}$ has a single point stationary and that such a point is a global minimizer."

This question asks us to find the stationary point of $f$ and to show that it is the global minimizer. After computing the partial derivatives of $f$ and setting them equal to zero, we find:

$2x_1+2x_{1}e^{x_{1}^2+x_{2}^2}=0$ and $1+2x_{2}e^{x_{1}^2+x_{2}^2}=0$.

From the first equation we find $x_1=0$, but since $e^{x_{1}^2+x_{2}^2}\geq 1$, it seems there is no real value of $x_2$ satisfying the second equation. This is strange: looking at the graph of $f$, it has a global minimum near $(0,-\tfrac{1}{2})$.
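For what it's worth, a quick numerical check (a stdlib-only bisection sketch of my own, not from either answer) shows that the second equation does have a real root, for a negative $x_2$:

```python
import math

# With x1 = 0 from the first equation, the second equation becomes
# g(x2) = 1 + 2*x2*exp(x2**2) = 0, which DOES have a real root for x2 < 0.
def g(x2):
    return 1 + 2 * x2 * math.exp(x2 ** 2)

# Bisection on [-1, 0]: g(-1) = 1 - 2e < 0 and g(0) = 1 > 0.
lo, hi = -1.0, 0.0
for _ in range(60):
    mid = (lo + hi) / 2
    if g(mid) < 0:
        lo = mid
    else:
        hi = mid

x2_star = (lo + hi) / 2
print(x2_star)  # about -0.4194, close to the graphical estimate of -1/2
```

So the graphical reading of roughly $-\tfrac{1}{2}$ is not far off; the exact root is near $-0.4194$.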


There are 2 best solutions below


After computing the gradient, you will find for the partial derivative with respect to $x_2$:

$$f'_{x_2} (x_1, x_2) = 1 + 2x_2 e^{x_1^2 + x_2^2}$$

Consider $g(x_2) = 1 + 2x_2 e^{x_1^2 + x_2^2}$, where $x_1$ is now treated as a fixed parameter.

$g(x_2)$ is continuous and smooth. Also:

$$\lim_{x_2\to +\infty} g(x_2) = +\infty \qquad \lim_{x_2\to -\infty}g(x_2) = -\infty$$

hence, by continuity and the intermediate value theorem, there must be at least one point $x_2$ for which $g(x_2) = 0$.

We can also show that $g(x_2)$ is strictly increasing, since $g'(x_2) = 2e^{x_1^2+x_2^2}(1+2x_2^2) > 0$; hence this point is unique.

So we get that there does exist a point $(x_1, x_2)$ for which the gradient of your initial function is zero.

Your initial function is the sum of convex functions ($x_1^2$, $x_2$, and $e^{x_1^2+x_2^2}$, the last being an increasing convex function of the convex function $x_1^2+x_2^2$), whence it is again convex.

Since a stationary point of a convex function is a global minimizer, this proves $(x_1, x_2)$ is a minimum point for your function $f(x_1, x_2)$, and it is unique.

Notice that the request is to show there does exist a point, not to calculate it.
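The existence argument can also be sanity-checked numerically. The sketch below (stdlib only; the stationary $x_2 \approx -0.41937$ is an assumed value taken from a separate numeric solve of $1+2x_2e^{x_2^2}=0$) samples $f$ on a coarse grid and confirms no sampled point beats the stationary one:

```python
import math

def f(x1, x2):
    return x1**2 + x2 + math.exp(x1**2 + x2**2)

x2_star = -0.41937  # assumed: approximate root of 1 + 2*x2*exp(x2**2) = 0
f_star = f(0.0, x2_star)

# Sample f on a grid over [-3, 3]^2 with step 0.1; every sampled value
# should exceed the value at the stationary point.
vals = [f(a / 10, b / 10) for a in range(-30, 31) for b in range(-30, 31)]
print(f_star, min(vals))
```

Of course this is only a spot check, not a proof; the convexity argument above is what settles global minimality.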


The gradient is

$$\nabla f = (2x_1(1+e^{x_1^2+x_2^2}),1+2x_2e^{x_1^2+x_2^2}) = 0$$

The first equation has a solution only when $x_1 = 0$ (since $1+e^{x_1^2+x_2^2} > 0$), which means the second component reduces to

$$2x_2e^{x_2^2} = - 1 $$

$$2x_2^2e^{2x_2^2} = \frac{1}{2}$$

$$x_2 = -\sqrt{\frac{1}{2}W\left(\frac{1}{2}\right)}$$

since the first line implies $x_2 < 0$; here $W(\cdot)$ is the Lambert W (product log) function. Alternatively, you could use the intermediate value theorem to prove that the equation in the first line has exactly one solution.
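The closed form can be verified numerically. This is a stdlib-only sketch: a Newton iteration stands in for a library Lambert W routine (such as `scipy.special.lambertw`, if SciPy is available), and we check that the resulting $x_2$ satisfies the first-line equation $2x_2e^{x_2^2} = -1$:

```python
import math

# Compute W(1/2): the root u of u*exp(u) = 1/2, via Newton's method.
u = 0.5
for _ in range(50):
    eu = math.exp(u)
    u -= (u * eu - 0.5) / (eu * (1 + u))

x2 = -math.sqrt(u / 2)  # the closed-form stationary x2
print(x2, 2 * x2 * math.exp(x2**2))  # second value should be -1 (up to rounding)
```

This gives $x_2 \approx -0.4194$, consistent with the graphical estimate in the question.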

Then computing the Hessian

$$Hf = \begin{pmatrix}2+2e^{x_1^2+x_2^2}(1+2x_1^2) & 4x_1x_2e^{x_1^2+x_2^2} \\ 4x_1x_2e^{x_1^2+x_2^2} & 2e^{x_1^2+x_2^2}(1+2x_2^2)\end{pmatrix}$$

the determinant at the point $(x_1,x_2) = \left(0,-\sqrt{\frac{1}{2}W\left(\frac{1}{2}\right)}\right)$ is

$$4(e^{x_2^2}+e^{2x_2^2})(1+2x_2^2) > 0 $$

(in fact this determinant is positive for any value of $x_2$, and the Hessian is positive definite at every point of $\mathbb{R}^2$, so $f$ is strictly convex) and $f_{11}, f_{22} > 0$; thus the point is a global minimum.
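The positive-definiteness check is easy to reproduce for a $2\times 2$ matrix (leading principal minors positive). A small sketch, with the stationary point's $x_2 \approx -0.419366$ taken as an assumed value from the Lambert W formula above:

```python
import math

x1, x2 = 0.0, -0.419366  # assumed: approximate stationary point

e = math.exp(x1**2 + x2**2)
h11 = 2 + 2 * e * (1 + 2 * x1**2)
h12 = 4 * x1 * x2 * e
h22 = 2 * e * (1 + 2 * x2**2)

# Sylvester's criterion for a 2x2 symmetric matrix: h11 > 0 and det > 0.
det = h11 * h22 - h12**2
print(h11 > 0, det > 0)  # prints: True True
```

At this point $h_{12} = 0$ (since $x_1 = 0$), so the Hessian is diagonal and the check reduces to the two diagonal entries being positive.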