Continuity of KKT Multipliers


Suppose I have a function $f:X \times \Theta \rightarrow \mathbb{R}$ with $n$ choice variables $\mathbf{x} \in X \subset \mathbb{R}^{n}$ and $m$ parameters $\boldsymbol{\theta} \in \Theta \subset \mathbb{R}^{m}$. $f$ is strictly concave in $\mathbf{x}$ and continuous on $X \times \Theta$.

I want to maximize $f(\mathbf{x};\boldsymbol{\theta})$ subject to non-negativity constraints on $\mathbf{x}$:

\begin{align} \max_{\mathbf{x}\geq 0} f(\mathbf{x};\boldsymbol{\theta}) \end{align}

One can formulate the following Lagrangean for the Karush-Kuhn-Tucker Problem,

\begin{align} \mathcal{L} = f(\mathbf{x};\boldsymbol{\theta}) + \sum_{n=1}^{N}\lambda_{n}x_{n} \end{align}

yielding the first-order (KKT) conditions, for all $n$,

\begin{equation} \frac{\partial f(\mathbf{x};\boldsymbol{\theta})}{\partial x_{n}} = -\lambda_{n} \end{equation}

\begin{equation} \lambda_{n}x_{n} = 0 \end{equation}

\begin{equation} \lambda_{n} \geq 0, x_{n} \geq 0 \end{equation}
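To make the conditions concrete, here is a small numerical sketch (not part of the original question) for a hypothetical strictly concave quadratic $f$. The matrix `Q` and the value of `theta` below are illustrative assumptions; the multipliers are recovered from the stationarity condition $\partial f/\partial x_n = -\lambda_n$ together with $\lambda_n \geq 0$, which is the $\max(0,\cdot)$ formula discussed further down.

```python
# Numerical illustration of the KKT conditions for max_{x >= 0} f(x; theta),
# using a hypothetical strictly concave quadratic:
#     f(x; theta) = -0.5 * x @ Q @ x + theta @ x,  with Q positive definite.
import numpy as np
from scipy.optimize import minimize

Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])          # positive definite -> f strictly concave
theta = np.array([1.0, -3.0])       # illustrative parameter vector

def f(x):
    return -0.5 * x @ Q @ x + theta @ x

def grad_f(x):
    return -Q @ x + theta

# Maximize f over x >= 0 by minimizing -f under bound constraints.
res = minimize(lambda x: -f(x), x0=np.ones(2), jac=lambda x: -grad_f(x),
               bounds=[(0, None), (0, None)], method="L-BFGS-B")
x_star = res.x

# Recover the multipliers: lambda_n = max(0, -df/dx_n) at the optimum.
lam = np.maximum(0.0, -grad_f(x_star))

print("x* =", x_star)
print("lambda =", lam)
print("complementary slackness:", lam * x_star)   # ~ 0 componentwise
```

With this `theta`, the second coordinate hits the bound ($x_2^* = 0$) and picks up a strictly positive multiplier, while the interior coordinate has $\lambda_1 = 0$.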

I want to show two things:

  1. The maximizer $\mathbf{x}^{*}(\theta)$ is continuous and differentiable in $\theta$.
  2. The multiplier $\boldsymbol{\lambda}^{*}(\theta)$ is continuous and differentiable in $\theta$.

To prove (1), I think I can readily invoke Berge's Maximum Theorem after choosing an appropriate upper bound for $\mathbf{x}$ (so the constraint set is compact). Then, by strict concavity, the optimizer is single-valued and hence continuous. But can I prove differentiability?

For proposition (2), can I use the same theorem to prove the continuity of the multipliers? How about differentiability?

From first principles, these multipliers can be written as

\begin{equation} \lambda_{n} = \max\left(0,-\frac{\partial f(\mathbf{x}^{*};\boldsymbol{\theta})}{\partial x_{n}}\right) \end{equation}

If $\frac{\partial f(\mathbf{x};\boldsymbol{\theta})}{\partial x_{n}}$ is continuous in $(\mathbf{x},\boldsymbol{\theta})$, and the optimizer is continuous in $\boldsymbol{\theta}$, then $\lambda_{n}$ should be continuous as well. However, I do not think it is always differentiable.

Any thoughts on how to show the continuity of $\lambda$?


Even when $f$ is smooth, neither the maximizer nor the multiplier need be differentiable. The trouble occurs, of course, at the points where the constraint becomes active.

To give a simple example, let $n = m = 1$, take $X = \Theta = \mathbb{R}$ and $$ f(x,\theta) := - \frac 12 (x-\theta)^2 $$ which is smooth and strictly concave in $x$. Obviously, the unique maximum of $f$ over $\mathbb{R}_+$ is given by $x(\theta) = \theta$ when $\theta \geq 0$ and $x(\theta) = 0$ when $\theta < 0$. Moreover, $\lambda(\theta) = 0$ when $\theta \geq 0$ and $\lambda(\theta) = -\theta$ when $\theta < 0$. Hence, neither function is differentiable at $\theta = 0$.
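The closed-form solution above can be checked numerically. The sketch below uses `scipy.optimize.minimize_scalar`, with an arbitrary upper bound of 10 standing in for $+\infty$:

```python
# Numerical check of the example: maximize f(x, theta) = -0.5*(x - theta)**2
# over x >= 0. Closed form: x(theta) = max(theta, 0), lambda(theta) = max(-theta, 0).
from scipy.optimize import minimize_scalar

def solve(theta):
    # Minimize -f over [0, 10]; the upper bound 10 is an arbitrary stand-in
    # for +infinity, valid for the theta values tried below.
    res = minimize_scalar(lambda x: 0.5 * (x - theta) ** 2,
                          bounds=(0.0, 10.0), method="bounded")
    x = res.x
    lam = max(0.0, x - theta)   # = max(0, -f'(x)), since f'(x) = theta - x
    return x, lam

for theta in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    x, lam = solve(theta)
    print(theta, round(x, 4), round(lam, 4))
```

For negative $\theta$ the solver pins $x$ at the bound and the multiplier comes out as $-\theta$; for positive $\theta$ the multiplier vanishes, matching the closed form.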

Concerning the continuity of $\lambda$, it looks like you already have a proof: if $\theta \mapsto x(\theta)$ is continuous and $f \in C^1$, then your formula yields the continuity of $\lambda$.