I have the following question. Consider a function $f:\mathbb{R}^n \longrightarrow \mathbb{R}$ such that:
there is a point $x_0 \in \mathbb{R}^n$ with $\frac{\partial f}{\partial x^k}(x_0) = 0$ for all $k$;
the Hessian matrix ${\partial^2 f \over \partial x^i \partial x^j}$ is positive definite for all $x \in \mathbb{R}^n$, but not necessarily constant.
Now, is the point $x_0$ a global minimizer of $f$? I think yes, but how do I prove it?
If the Hessian is positive definite everywhere, then $f$ is convex (indeed strictly convex); this follows from Taylor's theorem with second-order remainder, since the remainder term $\frac12 (x-y)^T D^2f(\xi)(x-y)$ is nonnegative. A differentiable convex function satisfies $f(x)-f(y) \ge Df(y)(x-y)$ for all $x,y$. Hence if $Df(x_0) = 0$, then $f(x) \ge f(x_0)$ for all $x$, so $x_0$ must be a global minimiser.
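As a numerical sanity check (not part of the proof), the gradient inequality can be tested on a sample function whose Hessian is positive definite everywhere; the function $f(x,y) = e^x + e^y + x^2 + y^2$ below is my own choice, not taken from the question:

```python
import math
import random

def f(v):
    # f(x, y) = e^x + e^y + x^2 + y^2; its Hessian is diag(e^x + 2, e^y + 2),
    # which is positive definite at every point (example function, assumed here).
    return math.exp(v[0]) + math.exp(v[1]) + v[0]**2 + v[1]**2

def grad_f(v):
    # gradient of f, computed by hand
    return (math.exp(v[0]) + 2*v[0], math.exp(v[1]) + 2*v[1])

# check f(x) - f(y) >= Df(y)(x - y) on many random pairs
random.seed(0)
for _ in range(1000):
    x = (random.uniform(-3, 3), random.uniform(-3, 3))
    y = (random.uniform(-3, 3), random.uniform(-3, 3))
    g = grad_f(y)
    lhs = f(x) - f(y)
    rhs = g[0]*(x[0] - y[0]) + g[1]*(x[1] - y[1])
    assert lhs >= rhs - 1e-12, (x, y)
```

Of course this only tests the inequality on sampled points; the proof below is what actually establishes it.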
Proof of the above inequality: let $\phi(t) = f(y+th)$; then $\phi$ is convex, since $f$ is convex and $t \mapsto y+th$ is affine. Convexity gives $\phi(t) = \phi((1-t)\cdot 0 + t\cdot 1) \le (1-t)\phi(0) + t\phi(1)$, so $\phi(t)-\phi(0) \le t(\phi(1)-\phi(0))$, and hence ${\phi(t)-\phi(0) \over t} \le \phi(1)-\phi(0)$ for $t \in (0,1]$. Letting $t \downarrow 0$ gives $\phi(1)-\phi(0) \ge \phi'(0)$.
If we let $h = x-y$, then $\phi(1) = f(x)$, $\phi(0) = f(y)$, and by the chain rule $\phi'(0) = Df(y)h = Df(y)(x-y)$, so we obtain the desired result.
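To illustrate the full claim numerically, the sketch below (using the assumed example $f(x,y) = e^x + e^y + x^2 + y^2$, which is separable, so its stationary point can be found by a one-dimensional bisection on $e^t + 2t = 0$) locates the stationary point and checks that it is below $f$ at many random sample points:

```python
import math
import random

def g(t):
    # one-coordinate derivative of e^t + t^2
    return math.exp(t) + 2*t

# bisection: g(-1) < 0 < g(0), so the stationary coordinate lies in [-1, 0]
lo, hi = -1.0, 0.0
for _ in range(100):
    mid = (lo + hi) / 2
    if g(mid) > 0:
        hi = mid
    else:
        lo = mid
t0 = (lo + hi) / 2

def f(v):
    return math.exp(v[0]) + math.exp(v[1]) + v[0]**2 + v[1]**2

x0 = (t0, t0)  # the gradient of f vanishes here, coordinate by coordinate
random.seed(1)
assert all(f((random.uniform(-5, 5), random.uniform(-5, 5))) >= f(x0) - 1e-12
           for _ in range(10000))
```

Sampling cannot replace the convexity argument, but it agrees with it: no sampled point falls below $f(x_0)$.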
A similar and very useful result is that the function $R$ defined here is non-decreasing in each of its variables (with the other fixed).
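The linked definition of $R$ is not reproduced above; assuming $R$ is the usual difference quotient $R(s,t) = \frac{f(t)-f(s)}{t-s}$ of a one-variable convex function, the monotonicity can be checked numerically, e.g. for $f(t) = e^t$:

```python
import math
import random

def R(s, t):
    # difference quotient (slope) of the convex function f(t) = e^t;
    # both the choice of f and the definition of R are assumptions here
    return (math.exp(t) - math.exp(s)) / (t - s)

random.seed(2)
for _ in range(1000):
    s = random.uniform(-3, 3)
    t1 = s + random.uniform(0.1, 1.0)
    t2 = t1 + random.uniform(0.1, 1.0)
    # non-decreasing in the second variable, first variable fixed
    assert R(s, t1) <= R(s, t2) + 1e-12
    # non-decreasing in the first variable, second variable fixed
    assert R(s, t2) <= R(t1, t2) + 1e-12
```

This is the standard "slope monotonicity" characterisation of convexity: for $s < t_1 < t_2$, one has $R(s,t_1) \le R(s,t_2) \le R(t_1,t_2)$.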