Suppose I have a smooth scalar function $f(x,y)$ where $x,y$ can be vectors. I am interested in finding a saddle-point:
$$\min_x \max_y f(x,y)\qquad(1)$$
I have the intuition that at such a point, the Hessian of $f$,
$$\partial^2 f = \left(\begin{array}{cc} \frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x \partial y}\\ \left(\frac{\partial^2 f}{\partial x \partial y}\right)^{\top} & \frac{\partial^2 f}{\partial y^2} \end{array}\right)$$
satisfies two conditions: 1) the upper-left block $\partial^2f/\partial x^2$ is positive definite, and 2) the lower-right block $\partial^2f/\partial y^2$ is negative definite.
My question is whether these conditions are enough to guarantee that the point is indeed a saddle-point that solves (1), at least if the domain is restricted to a local neighborhood.
Perhaps surprisingly, the answer is yes, at least locally, although the conditions turn out to be stronger than necessary. To see what the natural second-order conditions actually are, it is worth deriving them directly.
Define the function $\psi(x)$ by the condition:
$$\max_y f (x, y) = f (x, \psi (x))$$
That is, $y=\psi(x)$ is the value of $y$ that maximizes $f(x,y)$ for a fixed $x$. It follows that:
$$\frac{\partial f}{\partial y} (x, \psi (x)) = 0, \qquad \frac{\partial^2 f}{\partial y^2} (x, \psi (x)) < 0$$
where the second inequality means that the Hessian of $f$ with respect to the $y$ variables is negative definite at this point.
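As a sanity check, these defining conditions can be verified numerically on a concrete function (my own illustrative choice, not from the question): $f(x,y) = (x-1)^2 - (y-\sin x)^2$, whose inner maximizer is $\psi(x) = \sin x$ in closed form.

```python
import math

# Illustrative example (my choice): f(x, y) = (x - 1)^2 - (y - sin x)^2.
# For fixed x, f is a downward parabola in y, maximized at y = sin x,
# so psi(x) = sin x in closed form.

def f(x, y):
    return (x - 1.0) ** 2 - (y - math.sin(x)) ** 2

def psi(x):
    return math.sin(x)

x0 = 0.7
h = 1e-4

# Stationarity in y: df/dy at (x0, psi(x0)) should vanish.
df_dy = (f(x0, psi(x0) + h) - f(x0, psi(x0) - h)) / (2.0 * h)

# Concavity in y: d^2f/dy^2 should be negative (here exactly -2).
d2f_dy2 = (f(x0, psi(x0) + h) - 2 * f(x0, psi(x0)) + f(x0, psi(x0) - h)) / h**2

print(df_dy, d2f_dy2)  # ≈ 0 and ≈ -2
```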
This definition also determines the derivative of $\psi(x)$ with respect to $x$, because:
$$\frac{d}{d x} \left( \frac{\partial f}{\partial y} (x, \psi (x)) \right) = \left( \frac{\partial^2 f}{\partial x \partial y} (x, \psi (x)) \right)^{\top} + \frac{\partial^2 f}{\partial y^2} (x, \psi (x)) \frac{\partial \psi (x)}{\partial x} = 0$$
which implies:
$$\frac{\partial \psi (x)}{\partial x} = - \left( \frac{\partial^2 f}{\partial y^2} (x, \psi (x)) \right)^{- 1} \left( \frac{\partial^2 f}{\partial x \partial y} (x, \psi (x)) \right)^{\top}$$
Note that here, $\psi(x)$ is a vector-valued function, so the derivative is actually a Jacobian matrix: the right-hand side is the matrix product of the inverse Hessian of $f$ with respect to $y$ and the transposed block of mixed second derivatives with respect to $x$ and $y$.
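This Jacobian formula can be checked against a closed form. For a hypothetical quadratic test function (my choice) $f(x,y) = \frac12 x^\top A x + x^\top B y - \frac12 y^\top C y$ with $C$ positive definite, the inner maximizer is $\psi(x) = C^{-1} B^\top x$ exactly, so $\partial\psi/\partial x = C^{-1} B^\top$; the formula above should reproduce this with $\partial^2 f/\partial y^2 = -C$ and $\partial^2 f/\partial x\partial y = B$:

```python
import numpy as np

rng = np.random.default_rng(0)
nx, ny = 3, 2

# Hypothetical quadratic test function (my choice):
#   f(x, y) = 1/2 x^T A x + x^T B y - 1/2 y^T C y,  C positive definite.
# The inner maximizer is psi(x) = C^{-1} B^T x; note psi does not depend on A.
A = rng.standard_normal((nx, nx)); A = A @ A.T + nx * np.eye(nx)
B = rng.standard_normal((nx, ny))
M = rng.standard_normal((ny, ny)); C = M @ M.T + ny * np.eye(ny)

# Closed-form Jacobian of psi.
J_exact = np.linalg.solve(C, B.T)

# The formula derived above: dpsi/dx = -(f_yy)^{-1} (f_xy)^T,
# with f_yy = -C and f_xy = B for this quadratic.
f_yy = -C
f_xy = B
J_formula = -np.linalg.solve(f_yy, f_xy.T)

print(np.allclose(J_exact, J_formula))  # True
```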
Now we return to the original optimization problem:
$$\min_x \max_y f (x, y) = \min_x f(x,\psi(x))$$
A local minimum of $f(x,\psi(x))$ with respect to $x$ is guaranteed by the two usual sufficient conditions:
$$\frac{\partial f (x, \psi (x))}{\partial x} = 0, \qquad \frac{\partial^2 f (x, \psi (x))}{\partial x^2} > 0$$
where the derivatives are "total" in the sense that $\psi(x)$ must also be differentiated. We have
$$\frac{\partial f (x, \psi (x))}{\partial x} = \frac{\partial f}{\partial x} (x, \psi (x)) + \frac{\partial f}{\partial y} (x, \psi (x)) \frac{\partial \psi (x)}{\partial x} = \frac{\partial f}{\partial x} (x, \psi (x))$$
because $f(x,y)$ is stationary in $y$ at $y=\psi(x)$, identically in $x$; for the same reason the dropped term contributes nothing when we differentiate again. Next,
$$\frac{\partial^2 f (x, \psi (x))}{\partial x^2} = \frac{\partial^2 f}{\partial x^2} (x, \psi (x)) + \frac{\partial^2 f}{\partial x \partial y} (x, \psi (x)) \frac{\partial \psi (x)}{\partial x}$$
Substituting the derivative of $\psi(x)$:
$$\frac{\partial^2 f (x, \psi (x))}{\partial x^2} = \frac{\partial^2 f}{\partial x^2} (x, \psi (x)) - \frac{\partial^2 f}{\partial x \partial y} (x, \psi (x)) \left( \frac{\partial^2 f}{\partial y^2} (x, \psi (x)) \right)^{- 1} \left( \frac{\partial^2 f}{\partial x \partial y} (x, \psi (x)) \right)^{\top}$$
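This identity can itself be tested by finite differences, again on the illustrative function $f(x,y) = (x-1)^2 - (y-\sin x)^2$ (my choice, not from the question): here $g(x) = \max_y f(x,y) = (x-1)^2$, so the total second derivative should come out to exactly $2$, even though $\partial^2 f/\partial x^2$ alone does not equal $2$ along $y = \psi(x)$:

```python
import math

# Illustrative function (my choice): f(x, y) = (x - 1)^2 - (y - sin x)^2,
# with inner maximizer psi(x) = sin x, so g(x) = max_y f(x, y) = (x - 1)^2.

def f(x, y):
    return (x - 1.0) ** 2 - (y - math.sin(x)) ** 2

x0, h = 0.7, 1e-4
y0 = math.sin(x0)  # psi(x0)

# Second partials of f at (x0, y0) by central finite differences.
f_xx = (f(x0 + h, y0) - 2 * f(x0, y0) + f(x0 - h, y0)) / h**2
f_yy = (f(x0, y0 + h) - 2 * f(x0, y0) + f(x0, y0 - h)) / h**2
f_xy = (f(x0 + h, y0 + h) - f(x0 + h, y0 - h)
        - f(x0 - h, y0 + h) + f(x0 - h, y0 - h)) / (4 * h**2)

# Schur-complement expression from the derivation (scalar case).
schur = f_xx - f_xy * (1.0 / f_yy) * f_xy

# Total second derivative of g(x) = f(x, psi(x)) by finite differences.
g = lambda x: f(x, math.sin(x))
g_xx = (g(x0 + h) - 2 * g(x0) + g(x0 - h)) / h**2

print(schur, g_xx)  # both ≈ 2
```

Note that at this point $\partial^2 f/\partial x^2 = 2 - 2\cos^2 x_0 \approx 0.83$, so the correction term is doing real work.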
In summary, we find the following conditions for the original optimization problem:
$$\frac{\partial f}{\partial x} = 0, \quad \frac{\partial f}{\partial y} = 0, \quad \frac{\partial^2 f}{\partial y^2} < 0, \quad \frac{\partial^2 f}{\partial x^2} - \frac{\partial^2 f}{\partial x \partial y} \left( \frac{\partial^2 f}{\partial y^2} \right)^{- 1} \left( \frac{\partial^2 f}{\partial x \partial y} \right)^{\top} > 0$$
The first two conditions are the usual gradient stationarity. The third comes from the inner maximization over $y$. The last condition is the interesting one: it demands positive definiteness not of $\partial^2 f/\partial x^2$ itself, but of a certain combination of the blocks of the Hessian of $f$, namely the Schur complement of the $y$-block. Note, however, that since $\partial^2 f/\partial y^2$ is negative definite, the term $- \frac{\partial^2 f}{\partial x \partial y} \left( \frac{\partial^2 f}{\partial y^2} \right)^{- 1} \left( \frac{\partial^2 f}{\partial x \partial y} \right)^{\top}$ is positive semidefinite, so the Schur-complement condition is implied by, but strictly weaker than, positive definiteness of $\partial^2 f/\partial x^2$. The conditions in the question are therefore sufficient but not necessary: for example, $f(x,y) = -x^2 + 4xy - y^2$ satisfies all four conditions at the origin (the Schur complement is $-2 - 4 \cdot (-\tfrac{1}{2}) \cdot 4 = 6 > 0$) even though $\partial^2 f/\partial x^2 = -2 < 0$ there. I am not sure whether the eigenvalues of this Schur complement can be related in some simple way to the eigenvalues of the full Hessian.
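In fact, there is a clean relation: Haynsworth's inertia additivity theorem states that the inertia of the full Hessian (its counts of positive, negative, and zero eigenvalues) equals the sum of the inertias of the $y$-block and of this Schur complement, whenever the $y$-block is nonsingular. So the derived conditions hold exactly when the full Hessian is nonsingular with precisely $\dim y$ negative eigenvalues. A quick numerical check on a random symmetric matrix:

```python
import numpy as np

def inertia(S, tol=1e-9):
    """Return (n_pos, n_neg, n_zero) eigenvalue counts of a symmetric matrix."""
    w = np.linalg.eigvalsh(S)
    return (int((w > tol).sum()), int((w < -tol).sum()), int((abs(w) <= tol).sum()))

rng = np.random.default_rng(1)
nx, ny = 4, 3
n = nx + ny

# Random symmetric Hessian-like matrix with blocks
#   H = [[H_xx, H_xy], [H_xy^T, H_yy]].
M = rng.standard_normal((n, n))
H = (M + M.T) / 2.0
H_xx = H[:nx, :nx]; H_xy = H[:nx, nx:]; H_yy = H[nx:, nx:]

# Schur complement of the y-block (H_yy is nonsingular for generic H).
S = H_xx - H_xy @ np.linalg.solve(H_yy, H_xy.T)

# Haynsworth inertia additivity: inertia(H) = inertia(H_yy) + inertia(S).
lhs = inertia(H)
rhs = tuple(a + b for a, b in zip(inertia(H_yy), inertia(S)))
print(lhs, rhs)  # the two triples agree
```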