Compute the Hessian matrix $H_f(0,0)$ of the following functions $f:\mathbb{R}^2 \to \mathbb{R}$:
- $f(x,y) = 1+x+y+ \left \langle \begin{pmatrix} x \\ y \end{pmatrix} , \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}\right \rangle + x^4 + y^4$
- $f(x,y) = 1+x+y+ \left \langle \begin{pmatrix} x \\ y \end{pmatrix} , \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}\right \rangle + x^4 + y^4$
Of course, the computation is straightforward. I wonder, though: is there anything special about that inner product (or about any other part of these expressions)? Is there a way to interpret it, or to use it to our advantage?
Or is it just a contrived exercise by my instructors to give us practice with these objects (which I hope is not the case)?
Let's analyze in general $f: \Bbb R^n \to \Bbb R$ given by $$f(x) = f(x_1,\ldots, x_n) = 1 + x_1+\cdots + x_n + \sum_{k,\ell=1}^n a_{k\ell}x_kx_\ell + x_1^4 + \cdots + x_n^4,$$where $A=(a_{ij})$ is some matrix. We have that $$\frac{\partial f}{\partial x_i}(x) = 1 + \sum_{k=1}^n(a_{ik}+a_{ki})x_k + 4x_i^3, $$and so $$\frac{\partial^2f}{\partial x_i\partial x_j}(x) = a_{ij}+a_{ji} + 12x_i^2\delta_{ij},$$meaning that ${\rm Hess}\,f(0) = A+ A^\top$.
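To double-check this formula on the two matrices from the exercise, here is a quick symbolic verification with SymPy (a sketch; the helper name `hessian_at_origin` is mine):

```python
import sympy as sp

x, y = sp.symbols('x y')
v = sp.Matrix([x, y])

def hessian_at_origin(A):
    # f(x, y) = 1 + x + y + <v, A v> + x^4 + y^4
    f = 1 + x + y + (v.T * A * v)[0] + x**4 + y**4
    H = sp.hessian(f, (x, y))
    return H.subs({x: 0, y: 0})

A1 = sp.Matrix([[1, 1], [1, 1]])  # first function
A2 = sp.Matrix([[1, 1], [0, 1]])  # second function

print(hessian_at_origin(A1))  # equals A1 + A1^T
print(hessian_at_origin(A2))  # equals A2 + A2^T
```

Note that the two different matrices `A1` and `[[1, 1/2], [1/2, 1]]`-type symmetrizations can give the same quadratic form; only $A + A^\top$ is visible in the Hessian.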
I would say that this exercise was probably meant to make you wonder what happens when you differentiate a bilinear form (meaning, the $\langle x,Ax\rangle$ term). In particular, we see that ${\rm Hess}\,\langle\cdot, A\,\cdot\rangle = A+A^\top$, since the affine part $1+x_1+\cdots + x_n$ vanishes after two differentiations and the fourth-power terms do not contribute at the origin.
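For the record, here is the coordinate-free computation behind that fact. Writing $q(x) = \langle x, Ax\rangle$ and expanding at $x+h$ gives
$$q(x+h) = \langle x,Ax\rangle + \langle h, Ax\rangle + \langle x, Ah\rangle + \langle h, Ah\rangle,$$
so the derivative is the linear part,
$$Dq(x)h = \langle h, Ax\rangle + \langle x, Ah\rangle = \langle (A+A^\top)x,\, h\rangle,$$
using that the inner product is symmetric. Since $x \mapsto Dq(x)$ is itself linear with matrix $A+A^\top$, we get ${\rm Hess}\,q(x) = A+A^\top$ at every point, not just at the origin.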