How to compute the subgradient of the $p$-norm, $1 \leq p \leq \infty$


I am a graduate student in Communications Engineering taking an optimization course. I need help with an assignment on convex optimization.
We are asked to compute subgradients for several problems, so I want to know how to compute a subgradient of the $p$-norm $\|\mathbf{x}\|_p$. I know that a vector $\mathbf{g}$ is a subgradient of a function $f$ at a point $\mathbf{x}$ if it satisfies, for all $\mathbf{z}$, $$ f(\mathbf{z})\geq f(\mathbf{x})+\mathbf{g}^T(\mathbf{z}-\mathbf{x}). $$ Since the $p$-norm is not differentiable at $\mathbf{x}=0$, we want to find its subgradients at that point. Substituting $\mathbf{x}=0$, the condition becomes $\|\mathbf{z}\|_p\geq\mathbf{g}^T\mathbf{z}$ for all $\mathbf{z}$. But I don't know how to solve this inequality: how should I proceed?
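For contrast with the non-differentiable point $\mathbf{x}=0$, at any $\mathbf{x}\neq 0$ (and $1<p<\infty$) the $p$-norm is differentiable and the unique subgradient is the gradient, $g_i = \operatorname{sign}(x_i)\,|x_i|^{p-1}/\|\mathbf{x}\|_p^{p-1}$. A quick finite-difference sketch checking that formula (the values of $p$ and $\mathbf{x}$ below are arbitrary test choices, not from the assignment):

```python
import numpy as np

p = 3.0  # any 1 < p < inf

def pnorm(x):
    return np.sum(np.abs(x) ** p) ** (1.0 / p)

def grad_pnorm(x):
    # gradient of ||x||_p at x != 0:
    # g_i = sign(x_i) * |x_i|^(p-1) / ||x||_p^(p-1)
    return np.sign(x) * np.abs(x) ** (p - 1.0) / pnorm(x) ** (p - 1.0)

x = np.array([1.0, -2.0, 0.5])   # arbitrary nonzero test point
g = grad_pnorm(x)

# central finite differences as an independent check
eps = 1e-6
fd = np.array([(pnorm(x + eps * e) - pnorm(x - eps * e)) / (2 * eps)
               for e in np.eye(3)])
print(np.allclose(g, fd, atol=1e-5))  # True
```

At $\mathbf{x}=0$ this formula breaks down (division by zero), which is exactly why the subgradient inequality has to be solved directly there.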




Nobody answered, but I finally found the solution myself. Thanks, all!

The inequality can be solved using Hölder's inequality: $\mathbf{g}^T\mathbf{z} \leq \|\mathbf{g}\|_q \|\mathbf{z}\|_p$, where $q$ is the conjugate exponent, $1/p + 1/q = 1$. Hence any $\mathbf{g}$ with $\|\mathbf{g}\|_q \leq 1$ satisfies $\|\mathbf{z}\|_p \geq \mathbf{g}^T\mathbf{z}$ for all $\mathbf{z}$. Conversely, if $\|\mathbf{g}\|_q > 1$, the choice $z_i = \operatorname{sign}(g_i)|g_i|^{q-1}$ attains equality in Hölder's inequality and gives $\mathbf{g}^T\mathbf{z} = \|\mathbf{g}\|_q\|\mathbf{z}\|_p > \|\mathbf{z}\|_p$, so the condition fails. Therefore the subdifferential at the origin is the dual-norm unit ball: $$\partial\|\cdot\|_p(0) = \{\mathbf{g} : \|\mathbf{g}\|_q \leq 1\}.$$
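The Hölder argument can be sanity-checked numerically: a vector inside the dual-norm ball should satisfy $\|\mathbf{z}\|_p \geq \mathbf{g}^T\mathbf{z}$ for random $\mathbf{z}$, while one outside it should be defeated by the Hölder equality case. A small sketch (dimension, exponent, and scalings below are arbitrary test choices):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 3.0
q = p / (p - 1.0)  # conjugate exponent, 1/p + 1/q = 1

def pnorm(x, p):
    return np.sum(np.abs(x) ** p) ** (1.0 / p)

# A g strictly inside the dual unit ball (||g||_q < 1) should satisfy
# ||z||_p >= g.z for every z.
g = rng.standard_normal(5)
g /= pnorm(g, q) * 1.1  # scale so ||g||_q < 1
ok = all(pnorm(z, p) >= g @ z for z in rng.standard_normal((1000, 5)))
print(ok)  # True

# A g with ||g||_q > 1 is NOT a subgradient: the Hölder equality case
# z_i = sign(g_i)|g_i|^(q-1) gives g.z = ||g||_q * ||z||_p > ||z||_p.
g_bad = rng.standard_normal(5)
g_bad /= pnorm(g_bad, q) / 2.0  # scale so ||g_bad||_q = 2
z_star = np.sign(g_bad) * np.abs(g_bad) ** (q - 1.0)
print(pnorm(z_star, p) >= g_bad @ z_star)  # False
```

The random sampling is only a spot check, of course; the Hölder inequality is what makes the claim hold for every $\mathbf{z}$, not just the sampled ones.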