I am a graduate student in Communications Engineering taking an optimization course. I need help with one of my assignments, which is about convex optimization.
We are asked to compute subgradients for several problems, so I want to know how to compute a subgradient of the $p$-norm $\|\mathbf{x}\|_p$.
I know that a subgradient $\mathbf{g}$ of a function $f$ at a point $\mathbf{x}$ must satisfy, for all $\mathbf{z}$,
$$
f(\mathbf{z})\geq f(\mathbf{x})+\mathbf{g^T(z-x)}.
$$
Since this function is not differentiable at $\mathbf{x}=\mathbf{0}$, we aim to find a subgradient at that point. Substituting $\mathbf{x}=\mathbf{0}$ gives $\|\mathbf{z}\|_p\geq\mathbf{g}^T\mathbf{z}$ for all $\mathbf{z}$. But I don't know how to solve this last inequality: how should I proceed?
How to compute the subgradient of the $p$-norm, $1 \leq p \leq \infty$?
Asked 2026-03-25 by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) · 783 views
Nobody answered me, but I finally found the solution myself. Thanks all!
The inequality can be resolved with Hölder's inequality: $\mathbf{g}^T\mathbf{z} \leq \|\mathbf{g}\|_q\,\|\mathbf{z}\|_p$, where $q$ is the conjugate exponent, $1/p + 1/q = 1$. Hence any $\mathbf{g}$ with $\|\mathbf{g}\|_q \leq 1$ satisfies $\|\mathbf{z}\|_p \geq \mathbf{g}^T\mathbf{z}$ for all $\mathbf{z}$. Conversely, if $\|\mathbf{g}\|_q > 1$, the inequality fails at the Hölder-tight point $z_i = \operatorname{sign}(g_i)\,|g_i|^{q-1}$, since there $\mathbf{g}^T\mathbf{z} = \|\mathbf{g}\|_q\,\|\mathbf{z}\|_p$. So the subdifferential at the origin is the unit ball of the dual norm:
$$
\partial\|\mathbf{x}\|_p\big|_{\mathbf{x}=\mathbf{0}} = \{\mathbf{g} : \|\mathbf{g}\|_q \leq 1\},\qquad \tfrac{1}{p}+\tfrac{1}{q}=1.
$$
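As a quick numerical sanity check (not a proof), one can test the subgradient inequality $\|\mathbf{z}\|_p \geq \mathbf{g}^T\mathbf{z}$ at the Hölder-tight direction and at random points. This is only a sketch; the helper name `is_subgradient_at_zero` is my own:

```python
import numpy as np

def is_subgradient_at_zero(g, p, n_random=500, seed=0):
    """Check ||z||_p >= g^T z over test points z (a necessary-condition
    check, not a proof). By Hölder, this holds for all z exactly when
    ||g||_q <= 1 with 1/p + 1/q = 1."""
    q = p / (p - 1.0)  # Hölder conjugate exponent
    rng = np.random.default_rng(seed)
    # Worst-case direction z_i = sign(g_i)|g_i|^{q-1} makes Hölder tight,
    # so it detects any g outside the dual-norm unit ball.
    zs = [np.sign(g) * np.abs(g) ** (q - 1.0)]
    zs += [rng.standard_normal(g.shape) for _ in range(n_random)]
    return all(np.linalg.norm(z, ord=p) >= g @ z - 1e-9 for z in zs)

p = 3.0  # conjugate exponent q = 1.5
g_in = np.array([0.5, -0.3, 0.2])   # ||g||_1.5 < 1: inside the dual ball
g_out = np.array([1.5, 0.0, 0.0])   # ||g||_1.5 = 1.5 > 1: outside
print(is_subgradient_at_zero(g_in, p))   # True
print(is_subgradient_at_zero(g_out, p))  # False
```

The check mirrors the Hölder argument above: the tight direction alone suffices to reject a vector outside the dual unit ball, while the random points add extra coverage for the accepting case.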