A subdifferential formulated as an argmax problem


I am reading the article "Random Variables, Monotone Relations and Convex Analysis" by Rockafellar and Royset. It is paper number 226 on Rockafellar's website https://sites.math.washington.edu/~rtr/papers.html.

The article introduces the subdifferential of a proper, closed convex function on $\mathbb{R}$ as the set $$\partial f(x) = \{p \mid f'^{-}(x) \leq p \leq f'^{+}(x) \}, $$ where $f'^-$, $f'^+$ are the one-sided derivatives of $f$, and then goes on to state the following.

\begin{equation} \partial f^*(x) = \arg\max_p \{px - f(p) \}, \quad \partial f(x) = \arg\max_p \{px - f^*(p) \}, \label{eq:1} \end{equation} where $f^*$ is the convex conjugate of $f$, defined as $f^*(p) = \sup_x \{xp - f(x) \}$.
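As a sanity check (my own example, not from the article), the identity is easy to verify for the self-conjugate function $f(x) = x^2/2$, for which $f^* = f$ and the argmax is a smooth concave maximization:

```latex
% Worked example: f(x) = x^2/2, so f^*(p) = p^2/2 and f^* = f.
% The set \arg\max_p \{px - f^*(p)\} is found by setting the derivative to zero:
\[
  \frac{d}{dp}\Bigl( px - \tfrac{p^2}{2} \Bigr) = x - p = 0
  \quad\Longrightarrow\quad p = x,
\]
% hence \arg\max_p \{px - f^*(p)\} = \{x\} = \{f'(x)\} = \partial f(x),
% exactly as the stated identity predicts.
```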

Is this a well-known result with a proof that can be found in the literature? I attempted to verify it directly, but did not get past the statement

\begin{align} p' \in \partial f(x) \Longleftrightarrow p'\in\{p \mid f'^-(x) \leq p \leq f'^+(x)\}, \end{align} failing to see why such a $p'$ maximizes $p \mapsto px - f^*(p)$, where $f^*(p) = \sup_y \{yp - f(y)\}$ (the inner supremum needs its own dummy variable).

Best answer

First, we need the following two properties.

  1. (Fermat's rule) For a convex function $f$ and $x^* \in \mathbb{R}^n$, \begin{equation} 0 \in \partial f(x^*) \Leftrightarrow x^* \in \arg\min_x f(x). \end{equation}
  2. For a proper, closed convex function $f$, \begin{equation} y \in \partial f(x) \Leftrightarrow x \in \partial f^{*}(y). \end{equation} One can refer to the question "Proof about Conjugate and subgradient" for a detailed proof.
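A quick illustration of property 1 (my own example, not from the answer), using a smooth convex function where the subdifferential is a singleton:

```latex
% Fermat's rule for f(x) = (x-1)^2, whose subdifferential is {f'(x)}:
\[
  \partial f(x^*) = \{\, 2(x^* - 1) \,\},
  \qquad
  0 \in \partial f(x^*)
  \;\Longleftrightarrow\; x^* = 1
  \;\Longleftrightarrow\; x^* \in \arg\min_x (x-1)^2 .
\]
```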

Now, we have \begin{equation} \begin{aligned} x^* \in \arg\max_x \left\{\langle p, x\rangle-f(x) \right\} & \Leftrightarrow x^* \in \arg\min_x \left\{f(x)-\langle p, x\rangle \right\} \\ & \Leftrightarrow 0 \in \partial\bigl(f - \langle p, \cdot\rangle\bigr)(x^*) \\ & \Leftrightarrow 0 \in \partial f\left(x^*\right)-p \\ & \Leftrightarrow p \in \partial f\left(x^*\right) \\ & \Leftrightarrow x^* \in \partial f^{*}(p), \end{aligned} \end{equation} where the second step is property 1, the third uses the subdifferential sum rule (the linear term is differentiable), and the last is property 2. This is exactly the first identity in the question, up to relabeling the variables; the second identity follows by applying it to $f^*$ and using $f^{**} = f$ for proper, closed convex $f$.
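To make the identity concrete, here is a short numerical sketch (my own, not part of the answer) for $f(x) = x^2/2$: it approximates $f^*$ on a grid and checks that the maximizer of $p \mapsto px - f^*(p)$ agrees with $\partial f(x) = \{f'(x)\} = \{x\}$. The grid sizes, the test point, and the tolerance are illustrative choices.

```python
import numpy as np

# Sketch: verify  ∂f(x) = argmax_p { p x - f*(p) }  numerically
# for f(x) = x^2 / 2, where ∂f(x) = {x}.
f = lambda x: 0.5 * x**2

xs = np.linspace(-5, 5, 2001)   # grid for computing the conjugate
ps = np.linspace(-3, 3, 6001)   # grid of candidate slopes p

# Convex conjugate on the grid: f*(p) = sup_x { x p - f(x) }  (≈ p^2/2 here)
fstar = np.array([np.max(p * xs - f(xs)) for p in ps])

x0 = 0.7
p_best = ps[np.argmax(ps * x0 - fstar)]  # maximizer of p x0 - f*(p)

print(p_best)  # ≈ 0.7, matching ∂f(x0) = {f'(x0)} = {x0}
```

Replacing `f` with a nonsmooth convex function such as `abs` shows the other regime: at $x_0 = 0$ every $p \in [-1, 1]$ attains the max, mirroring the interval-valued subdifferential from the question.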