KKT necessary conditions - applied to linear sum function


I'm having issues interpreting the KKT conditions. Consider a very simple problem (from e.g. Economics)

$$ \max_{(x_1, x_2) \in \mathbb{R}^2_+} x_1 + x_2 \quad \text{s.t.} \quad p_1 x_1 + p_2 x_2 = w $$

We have

  • $f(x) = x_1 + x_2$ is linear, hence both concave and convex, and differentiable.
  • $h(x) = p \cdot x - w$ is affine.
  • Slater's condition holds (the constraint is affine, and $h(x) = 0$ is attainable somewhere on $\mathbb{R}^2_+$).
  • $\mathbb{R}^2_+$ is a convex subset of $\mathbb{R}^n$.

Thus, any point which satisfies the KKT conditions

  • $h(x) = 0$
  • $0 \in \partial f(x) + \lambda \partial h(x) $

should be optimal. Indeed, I believe the KKT theorem says these conditions are both necessary and sufficient here.

However, writing out the first-order (stationarity) conditions yields

$$ 1 = \lambda p_1, 1 = \lambda p_2 \implies p_1 = p_2 $$

Clearly, this condition fails whenever $p_1 \neq p_2$, e.g. when $p_1 > p_2$. Yet by inspection an optimum exists in that case, and it lies on the boundary:

$$ x = \left( 0, \frac{w}{p_2} \right) $$
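As a quick sanity check that this boundary point is indeed the maximiser, here is a small Python sketch (with hypothetical values $p_1 = 2 > p_2 = 1$ and $w = 10$) that samples the budget line inside $\mathbb{R}^2_+$ and compares the objective at each feasible point:

```python
# Sanity check with hypothetical prices p1 > p2 and wealth w:
# sample the budget line p1*x1 + p2*x2 = w over x1 in [0, w/p1]
# and find where the objective x1 + x2 is largest.
p1, p2, w = 2.0, 1.0, 10.0

def objective(x1):
    x2 = (w - p1 * x1) / p2   # solve the budget constraint for x2
    return x1 + x2

# sample 1001 points on the feasible segment
n = 1000
xs = [i * (w / p1) / n for i in range(n + 1)]
best_x1 = max(xs, key=objective)

print(best_x1, objective(best_x1))  # maximiser is x1 = 0, value w/p2 = 10
```

This matches the algebra: along the budget line the objective equals $w/p_2 + x_1(1 - p_1/p_2)$, which is strictly decreasing in $x_1$ when $p_1 > p_2$, so the maximum sits at the vertex $(0, w/p_2)$.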

So my question is - why is the KKT first-order condition not necessary for optimality in this case? As far as I can tell, all of the assumptions are satisfied, and nothing in the theorem restricts it to interior solutions - but maybe I'm missing an assumption?

Edit:

Note that the maximisation takes place over the nonnegative orthant $\mathbb{R}^2_+$. Most statements of the KKT theorem only require that the maximisation take place over a convex set, which the nonnegative orthant is - so I think that assumption is fine?