Show that any norm on $\mathbb{R}^{m}$ is a convex function. If $f: \mathbb{R}^{m} \to \mathbb{R}$ is a norm which comes from an inner product, prove that, for any $x \neq 0$ and $h \in \mathbb{R}^{m}$, $$\mathrm{d}^{2}f(x)\cdot h^{2}= (|h|^2|x|^2 - \langle x,h \rangle^2)|x|^{-3},$$ and note that the convexity of $f$ is equivalent to the Cauchy–Schwarz inequality.
The first part is easy and follows from the definition of a norm. For the second part, I know that $$\underbrace{\mathrm{d}^{2}f(x)}_{\text{Hessian}}\cdot h^2 = \sum_{i,j}\frac{\partial^2 f}{\partial x_{i}\partial x_{j}}\alpha_{i}\alpha_{j},$$ where $h=(\alpha_{1},\dots,\alpha_{m})$, but I couldn't go further. Perhaps my difficulty is in calculating the Hessian.
I proved in a previous question that if $f(x) = |x|^a$, then $$\mathrm{d}f(x)\cdot v = a|x|^{a-2}\langle x,v \rangle,$$ so taking $a = 1$ I get $\mathrm{d}f(x)\cdot v = |x|^{-2}\langle x,v \rangle$, i.e. $\displaystyle \frac{\partial f}{\partial x_{i}}(x) = |x|^{-2}x_{i}$ and $\displaystyle \frac{\partial^2 f}{\partial x_{i}\partial x_{j}}(x) = -2|x|^{-3}x_{i}x_{j}$, and writing out the Hessian I get $$\mathrm{d}^{2}f(x)\cdot h^2 = \sum_{i,j}-2|x|^{-3}x_{i}x_{j}\alpha_{i}\alpha_{j}.$$
This does not seem to make sense to me. Can anybody help me?
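In the meantime, a quick numerical sanity check (a rough sketch in Python; the random test point, step size, and use of a central second difference are arbitrary choices of mine) suggests the identity in the statement is correct, so the error must be somewhere in my Hessian:

```python
import numpy as np

def f(x):
    """The Euclidean norm f(x) = |x|, which comes from the standard inner product."""
    return np.linalg.norm(x)

def second_directional_derivative(x, h, eps=1e-4):
    # Central second difference of t -> f(x + t*h) at t = 0,
    # which approximates d^2 f(x) . h^2.
    return (f(x + eps * h) - 2.0 * f(x) + f(x - eps * h)) / eps**2

rng = np.random.default_rng(0)
x = rng.standard_normal(4)  # nonzero for this seed, as required
h = rng.standard_normal(4)

# The identity claimed in the problem statement.
claimed = ((h @ h) * (x @ x) - (x @ h) ** 2) / np.linalg.norm(x) ** 3
numeric = second_directional_derivative(x, h)
print(abs(numeric - claimed))  # close to 0
```

The claimed value is also nonnegative, consistent with convexity of the norm.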
First, $a-2=-1$ for $a=1$, hence the expression for the first derivative is wrong; it should be $\mathrm{d}f(x)\cdot v = |x|^{-1}\langle x,v \rangle$, i.e. $\frac{\partial f}{\partial x_{i}}(x) = |x|^{-1}x_{i}$.
Also, in your expression for $\frac{\partial^2 f}{\partial x_{i}\partial x_{j}}$ the product-rule term $\delta_{ij}|x|^{-2}$, coming from differentiating the factor $x_{i}$, is missing.
Observe that $$ \langle x,h\rangle = \sum_i \alpha_i x_i $$ and $$ \langle x,h\rangle^2 = \sum_i \alpha_i x_i \sum_j \alpha_j x_j = \sum_{i,j} \alpha_i \alpha_j x_i x_j $$
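With the corrected partial derivatives, here is one way the computation can be finished (a sketch):

```latex
\[
\frac{\partial f}{\partial x_i}(x) = \frac{x_i}{|x|},
\qquad
\frac{\partial^2 f}{\partial x_i\,\partial x_j}(x)
  = \frac{\delta_{ij}}{|x|} - \frac{x_i x_j}{|x|^{3}},
\]
so that
\[
\mathrm{d}^{2}f(x)\cdot h^{2}
  = \sum_{i,j}\Bigl(\frac{\delta_{ij}}{|x|} - \frac{x_i x_j}{|x|^{3}}\Bigr)\alpha_i\alpha_j
  = \frac{|h|^{2}}{|x|} - \frac{\langle x,h\rangle^{2}}{|x|^{3}}
  = \bigl(|h|^{2}|x|^{2} - \langle x,h\rangle^{2}\bigr)|x|^{-3}.
\]
```

This quantity is nonnegative for all $h$ precisely when $\langle x,h\rangle^{2} \le |x|^{2}|h|^{2}$, which is the Cauchy–Schwarz inequality, as the problem asks you to note.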