If the off-diagonal entries of the Hessian matrix are all zero, what does this imply about the function?
I found a related question here.
If a function $f$ has the form $$f(x_1,x_2,x_3)=g_1(x_1)+g_2(x_2)+g_3(x_3),$$ then the mixed second-order derivatives satisfy $$\frac{\partial^2}{\partial x_i \partial x_j} f(x) = 0 \qquad \forall i\neq j.$$
In the case of the Hessian matrix, does it only imply that the function is linear, as @ErdelvonMises mentioned?
First recall the definition of the Hessian $H$ of a function $f$: $$ H_{i,j} = \partial_{x_i,x_j} f. $$ For all the off-diagonal entries to be zero (i.e. $H_{i,j} = 0$ for all $i \neq j$), we need $$ \partial_{x_i,x_j} f = 0 \qquad \forall\, i \neq j. $$ Suppose $f$ is a linear combination of functions, each depending on a single variable: $$ f = \sum_{m=1}^M a_m g_m(x_m). $$ Then for $i \neq j$, $$ \partial_{x_i,x_j} f = \sum_{m=1}^M a_m \partial_{x_i,x_j} g_m(x_m) = 0, $$ since each $g_m$ depends on at most one of $x_i, x_j$. So if $f$ is additively separable in this sense, the off-diagonal entries of its Hessian vanish. I conjecture this condition is also necessary; indeed, on a convex domain it follows by integration: if $\partial_{x_i,x_j} f = 0$ for all $j \neq i$, then $\partial_{x_i} f$ depends only on $x_i$, so $f$ decomposes as a sum of single-variable functions (up to a constant).
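As a quick numerical sanity check (a minimal sketch: the separable function below and the test point are arbitrary choices, and `mixed_partial` is a hypothetical helper), we can estimate the mixed second partials with central finite differences and see that they all vanish for a separable function:

```python
import math

# Hypothetical separable function: f(x) = sin(x1) + x2^3 + exp(x3)
def f(x):
    return math.sin(x[0]) + x[1] ** 3 + math.exp(x[2])

def mixed_partial(func, x, i, j, h=1e-4):
    """Central finite-difference estimate of d^2 func / (dx_i dx_j)."""
    def shifted(x, k, d):
        y = list(x)
        y[k] += d
        return y
    return (func(shifted(shifted(x, i, h), j, h))
            - func(shifted(shifted(x, i, h), j, -h))
            - func(shifted(shifted(x, i, -h), j, h))
            + func(shifted(shifted(x, i, -h), j, -h))) / (4 * h * h)

x0 = [0.3, -1.2, 0.7]
off_diag = [mixed_partial(f, x0, i, j)
            for i in range(3) for j in range(3) if i != j]
print(off_diag)  # all six entries are ~0, up to floating-point noise
```

Swapping in any non-separable $f$ (one containing a cross term such as $x_1 x_2$) makes the corresponding entries nonzero.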
To gain some intuition about why this fails in other cases, suppose $f$ is a multivariate polynomial of degree two, $$ f = a + \sum_{n = 1}^N \sum_{m=1}^2 b_{n,m} x_n^m + \sum_{n = 1}^N \sum_{m = 1}^N c_{n,m} x_n x_m. $$ Taking the off-diagonal derivatives again, $$ \partial_{x_i,x_j} f = \sum_{n = 1}^N \sum_{m = 1}^N c_{n,m}\, \partial_{x_i,x_j} (x_n x_m) = c_{i,j} + c_{j,i} \qquad \text{for } i \neq j, $$ which is nonzero whenever the cross term $x_i x_j$ actually appears in $f$.
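The same finite-difference check illustrates this (a sketch: the quadratic and its coefficient $3$ are arbitrary choices, and `mixed_partial` is the same hypothetical helper as above, repeated so the snippet is self-contained). The off-diagonal Hessian entry recovers exactly the cross-term coefficient:

```python
# Quadratic with a cross term: f(x1, x2) = x1^2 + x2^2 + 3*x1*x2,
# so d^2 f / (dx1 dx2) = 3.
def f(x):
    return x[0] ** 2 + x[1] ** 2 + 3 * x[0] * x[1]

def mixed_partial(func, x, i, j, h=1e-4):
    """Central finite-difference estimate of d^2 func / (dx_i dx_j)."""
    def shifted(x, k, d):
        y = list(x)
        y[k] += d
        return y
    return (func(shifted(shifted(x, i, h), j, h))
            - func(shifted(shifted(x, i, h), j, -h))
            - func(shifted(shifted(x, i, -h), j, h))
            + func(shifted(shifted(x, i, -h), j, -h))) / (4 * h * h)

print(mixed_partial(f, [0.5, -0.4], 0, 1))  # ~3.0: the cross-term coefficient
```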