In the convex optimization textbook (page 475), Stephen Boyd defines the normalized steepest descent direction with respect to the norm $\|\cdot\|$ as
$$ \Delta x_{\text{nsd}} = \operatorname{argmin} \{ \nabla f(x)^T v \; | \; \|v\| \leq 1 \}$$
In Appendix A of the same book, the dual norm is defined as $$ \| \nabla f(x) \|_{\ast} = \sup \{ \nabla f(x)^T v \; | \; \|v\| \leq 1 \}$$
I have tried many ways to prove the following identity: $$\nabla f(x)^T \Delta x_{\text{nsd}} = - \| \nabla f(x) \|_{\ast} $$
Unfortunately, I have not found a proof. Do you have a solution or a suggestion? Thank you very much!
HINT It boils down to understanding the dual norm. The dual norm has the nice property that $$ |\nabla f(x)^T v|\leq\|\nabla f(x)\|_*\|v\| $$ for any vector $v$. So when $\|v\|\leq1$ we have $$ |\nabla f(x)^T v|\leq\|\nabla f(x)\|_* $$ Then you need to figure out when equality is achieved.
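As a quick numerical sanity check of the identity (not a proof), one can work out two concrete cases where $\Delta x_{\text{nsd}}$ has a closed form: for the Euclidean norm the dual norm is again the Euclidean norm and $\Delta x_{\text{nsd}} = -\nabla f(x)/\|\nabla f(x)\|_2$; for the $\ell_1$ norm the dual norm is the $\ell_\infty$ norm and the minimizer is a signed coordinate vector at the largest-magnitude gradient entry. The gradient vector below is a made-up stand-in for $\nabla f(x)$.

```python
import numpy as np

rng = np.random.default_rng(0)
grad = rng.standard_normal(5)  # hypothetical stand-in for \nabla f(x)

# Case 1: Euclidean norm. The dual norm is also the Euclidean norm,
# and the normalized steepest descent direction is -grad / ||grad||_2.
dx_l2 = -grad / np.linalg.norm(grad)
lhs_l2 = grad @ dx_l2
dual_l2 = np.linalg.norm(grad)          # ||grad||_* for the l2 norm
print(np.isclose(lhs_l2, -dual_l2))

# Case 2: l1 norm. The dual norm is the l-infinity norm, and the
# minimizer of grad^T v over ||v||_1 <= 1 puts all its weight on the
# coordinate with the largest |grad_i|, with the opposite sign.
i = np.argmax(np.abs(grad))
dx_l1 = np.zeros_like(grad)
dx_l1[i] = -np.sign(grad[i])
lhs_l1 = grad @ dx_l1
dual_l1 = np.linalg.norm(grad, np.inf)  # ||grad||_* for the l1 norm
print(np.isclose(lhs_l1, -dual_l1))
```

In both cases the printed result confirms $\nabla f(x)^T \Delta x_{\text{nsd}} = -\|\nabla f(x)\|_*$, which also illustrates where equality in the hint's inequality is attained: at a $v$ achieving the supremum in the dual-norm definition, with its sign flipped.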