I see that when $f_{xx}\times f_{yy}-f_{xy}^2<0$ the stationary point is a saddle point, and when $f_{xx}\times f_{yy}-f_{xy}^2>0$ it is a minimum or maximum. What exactly is happening when $f_{xx}\times f_{yy}-f_{xy}^2=0$?
Finding the minimum or maximum of a bivariate function when $f_{xx}\times f_{yy}-f_{xy}^2=0$.
899 Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 3 answers below.
For a regular $f(x,y)$ we have around a point $p_0 = (x_0,y_0)$ with $p = (x,y)$
$$ f(x,y) = f(x_0,y_0) + f_x(p_0)(x-x_0)+f_y(p_0)(y-y_0) + \frac 12(p-p_0)^{\top}J(p_0)(p-p_0)+O(|p-p_0|^3) $$
If at $p_0$ we have a relative minimum/maximum then $f_x(p_0) = f_y(p_0) = 0$ so the characterization is done with the help of
$$ J(p_0) = \left(\begin{array}{cc}f_{xx}&f_{xy}\\f_{yx}&f_{yy}\end{array}\right)_{p_0} $$
If $\det\left(J(p_0)\right) = 0$ then the associated quadratic form degenerates into the product of two linear factors (possibly repeated), as for example $(x+y)^2$, characterizing a parabolic point. If $J(p_0) = 0$, the classification should be done with the first non-null higher-order differential contained in $O(|p-p_0|^3)$.
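As a quick sanity check of this parabolic case (a sketch using sympy; the answer itself gives no code), one can verify that $(x+y)^2$ has a stationary point at the origin where $\det J = 0$, so the second-order test is inconclusive:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = (x + y)**2  # the parabolic example from the answer

# The gradient vanishes at the origin, so it is a stationary point.
grad = [sp.diff(f, v) for v in (x, y)]
assert all(g.subs({x: 0, y: 0}) == 0 for g in grad)

# The Hessian J and its determinant.
J = sp.hessian(f, (x, y))
print(J)        # Matrix([[2, 2], [2, 2]])
print(J.det())  # 0 -> the quadratic form is degenerate
```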
An indefinite Hessian is a sufficient but not necessary condition for a stationary point to be a saddle point. For example, the function given by $x^4-y^4$ has a (unique) stationary point at $(x,y)=(0,0)$, at which the determinant of the Hessian is zero.
Consider also the functions given by $(x+y)^2$, $-(x+y)^2$, $(x+y)^3$. In all cases the determinant of the Hessian is zero for all real numbers $x$ and $y$. In the first case the minimum is zero and there are many minimizers ($(t,-t)$ for any $t$) which are all stationary points. In the second case the maximum is zero and there are many maximizers ($(t,t)$ for any $t$) which are all stationary points. In the third case there are no maximizers or minimizers, but there are lots of stationary points ($(t,-t)$ for any $t$). These are all saddle points.
In case you are not aware of what a Hessian is: the Hessian matrix collects the second-order partial derivatives: $\begin{pmatrix}f_{xx}(x,y)& f_{xy}(x,y)\\ f_{yx}(x,y) & f_{yy}(x,y)\end{pmatrix}$
The determinant of Hessian is negative ($f_{xx}f_{yy}-f_{xy}^2<0$) if and only if the Hessian is indefinite.
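The three examples above can be checked symbolically (a sketch using sympy; not part of the original answer): each has identically zero Hessian determinant, even though the first attains a minimum, the second a maximum, and the third neither.

```python
import sympy as sp

x, y = sp.symbols('x y')
# The three examples from the answer: minimum, maximum, neither.
examples = [(x + y)**2, -(x + y)**2, (x + y)**3]

for f in examples:
    H = sp.hessian(f, (x, y))
    det = sp.simplify(H.det())
    print(f, '-> det Hessian =', det)  # 0 in every case
```

This makes the point concrete: a vanishing determinant alone cannot distinguish between these three behaviours.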
I'd like to mention the role of the eigenvalues of the hessian and relate this to the other answers.
At a critical point $p_0=(x_0,y_0)$ of a function $f(x,y)$ (i.e. $\nabla f(p_0)=0$) the local shape of the graph of $f$ around $p_0$ is determined by the hessian $$ \mathrm{Hess}(p_0)=\begin{bmatrix} f_{xx}(p_0) & f_{xy}(p_0) \\ f_{xy}(p_0) & f_{yy}(p_0) \end{bmatrix}.$$
This is a symmetric matrix and thus diagonalisable. The eigenvectors of the Hessian point in the directions in which the function curves most strongly, and the eigenvalues $\lambda_1$, $\lambda_2$ tell us whether the function rises or falls in the direction of these eigenvectors. If both eigenvalues are non-zero, we get: if $\lambda_1, \lambda_2 > 0$, a local minimum; if $\lambda_1, \lambda_2 < 0$, a local maximum; if they have opposite signs, a saddle point.
These conditions can be expressed in terms of $\det \mathrm{Hess}(p_0)$, but the eigenvalue approach is more general: it also works for functions with more than two variables.
If $f_{xx}(p_0) f_{yy}(p_0)-f_{xy}^2(p_0) =0$, that is $\det \mathrm{Hess}(p_0) = 0$, then at least one eigenvalue is zero. If the other eigenvalue is non-zero, the function is curved in that eigendirection but flat to second order along the zero eigenvector, so the type of the critical point is decided by the higher-order terms in the flat direction (compare $x^2+y^4$, a minimum, with $x^2-y^4$, a saddle).
If both eigenvalues are zero, then there is no general conclusion, as shown in the examples of smcc. You should then look at the higher order terms of the Taylor expansion of $f$ as Cesareo mentioned in his answer.
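The eigenvalue test described above can be sketched numerically. The helper below is hypothetical (not from any of the answers); it classifies a critical point from the eigenvalues of a Hessian evaluated there, with a small tolerance to absorb floating-point noise:

```python
import numpy as np

def classify(hessian, tol=1e-9):
    """Classify a critical point from the eigenvalues of its Hessian.

    `tol` treats eigenvalues within [-tol, tol] as zero.
    """
    eig = np.linalg.eigvalsh(np.asarray(hessian, dtype=float))
    if np.all(eig > tol):
        return 'local minimum'
    if np.all(eig < -tol):
        return 'local maximum'
    if np.any(eig > tol) and np.any(eig < -tol):
        return 'saddle point'
    # At least one (numerically) zero eigenvalue: det Hess = 0.
    return 'degenerate: inspect higher-order terms'

# f(x, y) = x^2 + y^2 at the origin: Hessian diag(2, 2).
print(classify([[2, 0], [0, 2]]))  # local minimum
# f(x, y) = (x + y)^2 at the origin: eigenvalues 4 and 0.
print(classify([[2, 2], [2, 2]]))  # degenerate: inspect higher-order terms
```

As the answer notes, this approach carries over unchanged to functions of more than two variables, where the determinant criterion alone no longer suffices.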