Local minima and maxima over the simplex


Consider a smooth function $f : \mathbb{R}^n \to \mathbb{R}$, and consider the simplex:

$$\Delta_n = \left\{{\bf x} \in \mathbb{R}^n:x_i \in [0,1] \wedge \sum_{i=1}^n x_i = 1\right\}.$$

Suppose that:

$$\underline{\bf x} = \arg \min_{{\bf x} \in \Delta_n} f({\bf x}) ~ \text{and} ~ \overline{\bf x} = \arg \max_{{\bf x} \in \Delta_n} f({\bf x}). $$

Consider now the function $g : \mathbb{R}^n \to \mathbb{R}$:

$$g({\bf x}) = f({\bf x}) + c\sum_{i=1}^n x_i,$$

for some real number $c \neq 0$. Can I conclude that:

$$\underline{\bf x} = \arg \min_{{\bf x} \in \Delta_n} g({\bf x}) ~ \text{and} ~ \overline{\bf x} = \arg \max_{{\bf x} \in \Delta_n} g({\bf x})? $$

I guess this is true, since the term $c\sum_{i=1}^n x_i$ is constant on $\Delta_n$.

Anyway, suppose for example that $\underline{\bf x}$ belongs to the interior of $\Delta_n$. This would mean that $\nabla f(\underline{\bf x}) = 0$. On the other hand, since $\underline{\bf x}$ is also the minimizer of $g$, $\nabla g(\underline{\bf x})$ should vanish as well, but in this case:

$$\nabla g(\underline{\bf x}) = \begin{bmatrix}c\\c\\\vdots\\c\end{bmatrix} \neq 0.$$
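As a numerical sanity check of the first claim, one can grid-search a small simplex in $\mathbb{R}^3$ and compare the extremizers of $f$ and $g$; the $f$ and $c$ below are arbitrary illustrative choices, not part of the question.

```python
# Grid-search the 2-simplex in R^3 and compare the minimizers/maximizers
# of f and g(x) = f(x) + c * sum(x). Both should coincide, since the
# added term is constant (equal to c) on the simplex.

def f(x):
    # an arbitrary smooth test function (illustrative choice)
    return (x[0] - 0.2) ** 2 + 2 * (x[1] - 0.5) ** 2 + x[2] ** 2

c = 3.0

def g(x):
    return f(x) + c * sum(x)

# enumerate simplex points with step 1/n
n = 100
simplex = [(i / n, j / n, (n - i - j) / n)
           for i in range(n + 1) for j in range(n + 1 - i)]

xmin_f = min(simplex, key=f)
xmin_g = min(simplex, key=g)
xmax_f = max(simplex, key=f)
xmax_g = max(simplex, key=g)

assert xmin_f == xmin_g   # same minimizer
assert xmax_f == xmax_g   # same maximizer
```

On this grid both functions are minimized at the same interior point and maximized at the same vertex, as expected.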


Accepted answer:

As you have already pointed out, any feasible $x$, and in particular the argmin or argmax of $g$, satisfies $\sum_{i=1}^n x_i = 1$. Therefore $g(x) = f(x) + c$ on $\Delta_n$, so $f$ and $g$ have the same argmin, and likewise the same argmax.

The optimality condition for $f$ is not, and cannot be, $\nabla f(x) = 0$; likewise, the optimality condition for $g$ cannot be $\nabla g(x) = 0$. That is because there is always an active constraint, $\sum_{i=1}^n x_i = 1$, so a feasible point can never be in the interior of $\Delta_n$ (as a subset of $\mathbb{R}^n$, the simplex has empty interior).
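Concretely, a sketch of the correct first-order condition: on the relative interior of $\Delta_n$ (where only the equality constraint is active), introducing a Lagrange multiplier $\lambda$ for $\sum_{i=1}^n x_i = 1$ gives

$$\nabla f(\underline{\bf x}) = \lambda \mathbf{1} \quad \text{for some } \lambda \in \mathbb{R},$$

and hence

$$\nabla g(\underline{\bf x}) = \nabla f(\underline{\bf x}) + c\,\mathbf{1} = (\lambda + c)\,\mathbf{1}.$$

So the same point is stationary for $g$, with the multiplier shifted from $\lambda$ to $\lambda + c$. This dissolves the apparent contradiction in the question: $\nabla f(\underline{\bf x})$ need not vanish, it only needs to be proportional to $\mathbf{1}$.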

See https://en.wikipedia.org/wiki/Karush%E2%80%93Kuhn%E2%80%93Tucker_conditions . In your case the constraints are linear, so the linearity constraint qualification holds, and the KKT conditions are necessary for optimality, provided $f$ is continuously differentiable.
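The stationarity condition above can be verified numerically on a worked example; the choice $f(x) = \sum_i x_i^2$ below is an assumption for illustration, with known minimizer $\underline{\bf x} = \frac{1}{n}\mathbf{1}$ over the simplex.

```python
# Check the equality-constrained stationarity condition at the known
# minimizer of f(x) = sum(x_i^2) over the simplex (illustrative example).
import numpy as np

n = 4
f_grad = lambda x: 2 * x          # gradient of f(x) = sum(x_i^2)
x_star = np.full(n, 1 / n)        # minimizer of f over the simplex

# Stationarity: grad f(x*) is a multiple of the all-ones vector.
grad = f_grad(x_star)
lam = grad[0]                     # the multiplier, here 2/n
assert np.allclose(grad, lam * np.ones(n))

# For g = f + c * sum(x), the gradient shifts by c * 1: it is still a
# multiple of 1, with multiplier lam + c, so x* stays stationary for g.
c = 3.0
g_grad = grad + c * np.ones(n)
assert np.allclose(g_grad, (lam + c) * np.ones(n))
```

Note that $\nabla f(\underline{\bf x}) = \frac{2}{n}\mathbf{1} \neq 0$, even though $\underline{\bf x}$ lies in the relative interior of the simplex.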