Showing the minimizer attains equality for a general norm.


I've been doing some self-studying on convex optimization recently. I'm trying to prove the fact that: $$\min_{x \in \mathbb{R}^n} y \cdot x + \frac{c}{2} \|x\|^2 = \frac{-1}{2c} \|y\|_*^2,$$ where $c>0$, $y \in \mathbb{R}^n$, $\|\cdot\|$ denotes a general norm, and $\|\cdot\|_*$ denotes its dual norm.

I've been trying to prove this fact using results about strong convexity and L-smoothness, but to no avail. Could anyone help me prove it?

Thanks!

1 Answer

By (a generalized) Cauchy-Schwarz inequality, \begin{align*} | y^T x | \le \|y\|_* \|x\|. \end{align*} So $y^T x \ge -\|y\|_* \|x\|$. Then \begin{align*} y^T x + \frac{c}{2} \|x\|^2 \ge -\|y\|_* \|x\| + \frac{c}{2}\|x\|^2. \end{align*} The right-hand side, viewed as a quadratic in $\|x\|$, is minimized by taking $\|x\| = \frac{\|y\|_*}{c}$. Then we have \begin{align*} y^T x + \frac{c}{2} \|x\|^2 \ge -\|y\|_* \|x\| + \frac{c}{2}\|x\|^2 \ge -\frac{1}{2c}\|y\|_*^2. \end{align*}

If this inequality is tight, then we get what is needed. It suffices to show there exists an $x$ with $\|x\| = \|y\|_*/c$ such that $y^T x = -\|y\|_*\|x\|$. This is indeed the case. By definition of the dual norm, we have $\|y\|_* = \sup_{\|x\| =1} y^T x$. Since $\{x : \|x\|=1\}$ is compact and $y^T x$ is linear, the sup is attained, i.e. it is a max. So there is some $\bar{x}$ with $\|\bar{x}\| = 1$ such that $\|y\|_* = y^T \bar{x}$.

Now take $\tilde{x} = -\frac{\|y\|_*}{c} \bar{x}$. Then we have $\|\tilde{x}\| = \|y\|_*/c$ and $y^T \tilde{x} = -\frac{\|y\|_*}{c} y^T \bar{x} = -\frac{\|y\|_*^2}{c} = -\|y\|_* \|\tilde{x}\|$. This shows the equality is achieved.
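As a quick sanity check (not part of the proof), one can verify the identity numerically for a concrete norm. Below is a sketch using the $\ell_1$ norm, whose dual is the $\ell_\infty$ norm; the names `x_bar`, `x_tilde`, etc. just mirror the symbols above.

```python
# Numerical sanity check of
#   min_x  y.x + (c/2)||x||^2  =  -(1/(2c)) ||y||_*^2
# for the l1 norm (dual norm: l_infinity). Illustrative only;
# the proof above works for any norm.
import random

def l1(x):
    return sum(abs(xi) for xi in x)

def linf(y):
    return max(abs(yi) for yi in y)

random.seed(0)
n, c = 5, 2.0
y = [random.uniform(-1, 1) for _ in range(n)]

# Minimizer from the proof: x_bar is a unit-l1 vector achieving
# y.x_bar = ||y||_inf, i.e. a signed standard basis vector at the
# coordinate where |y_i| is largest.
i_star = max(range(n), key=lambda i: abs(y[i]))
x_bar = [0.0] * n
x_bar[i_star] = 1.0 if y[i_star] > 0 else -1.0
x_tilde = [-(linf(y) / c) * xb for xb in x_bar]

def objective(x):
    return sum(yi * xi for yi, xi in zip(y, x)) + (c / 2) * l1(x) ** 2

claimed_min = -linf(y) ** 2 / (2 * c)

# The construction attains the claimed minimum...
print(abs(objective(x_tilde) - claimed_min) < 1e-12)  # True

# ...and random points never go below it.
ok = all(objective([random.uniform(-2, 2) for _ in range(n)]) >= claimed_min
         for _ in range(10000))
print(ok)  # True
```

Swapping in another norm only requires changing `l1`, `linf`, and the construction of `x_bar` (a unit vector achieving the dual norm).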