Minimum distance from origin in $\mathbb{R}^3$ but Hessian is indefinite


Problem: Consider the set $M= \lbrace (x,y,z) \in \mathbb{R}^3 \mid x^2+2y^2-z^2=1 \rbrace$ and find all points on $M$ which have minimal Euclidean distance from the origin.

My approach: I defined the function $f(x,y,z):= x^2 + y^2+z^2$ and tried to minimize it on $M$. The function $f$ is the squared Euclidean distance; minimizing it on $M$ also minimizes the Euclidean distance itself.

Define $g(x,y,z)=x^2+2y^2-z^2-1$, so that the constraint of $M$ reads $g=0$.

Compute the Lagrange function: $L=f- \lambda g= (1- \lambda)x^2+ (1-2 \lambda)y^2 + (1 + \lambda)z^2 + \lambda$
Now I have to solve the following system of equations: \begin{align} \frac{\partial L}{\partial x}&= 2(1-\lambda)x=0 \tag{I} \\ \frac{\partial L}{\partial y}&= 2(1-2\lambda)y=0 \tag{II}\\ \frac{\partial L}{\partial z}&= 2(1+\lambda)z=0 \tag{III} \end{align} I know that $(x,y,z)\neq(0,0,0)$ because $(0,0,0) \notin M$, so $x$, $y$, $z$ cannot all vanish simultaneously.
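The system (I)–(III) together with the constraint $g=0$ can also be solved symbolically. Here is a minimal sketch of my own (assuming SymPy is available; the variable names just mirror the notation above):

```python
# Solve the Lagrange system (I)-(III) together with the constraint g = 0.
# Declaring the symbols real excludes the complex candidates discussed below.
import sympy as sp

x, y, z, lam = sp.symbols('x y z lambda', real=True)
f = x**2 + y**2 + z**2
g = x**2 + 2*y**2 - z**2 - 1
L = f - lam * g

equations = [sp.diff(L, v) for v in (x, y, z)] + [g]
solutions = sp.solve(equations, [x, y, z, lam], dict=True)
for s in solutions:
    print(s)
```

This returns exactly the real stationary points worked out by hand below, each paired with its multiplier $\lambda$.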

My Problems: I manage to solve the above system of equations, but I run into results I cannot successfully interpret. Let me show: The above system of equations seems to grant me a bit of a freedom.

$\bullet$ Assume $x \neq 0$. Then $\lambda = 1$ by (I), so (II) and (III) force $y=z=0$, and the constraint gives $x = \pm 1$. This yields the points $p_1=(1,0,0)$ and $p_2=(-1,0,0)$.

$\bullet$ Assume $y \neq 0$. Then $\lambda = 1/2$ by (II), so (I) and (III) force $x=z=0$, and the constraint gives $y = \pm \frac{1}{\sqrt{2}}$. This yields two more points, $p_3=(0,1/\sqrt{2},0)$ and $p_4=(0,-1/\sqrt{2},0)$.

$\bullet$ Assume $z \neq 0$. Then $\lambda = -1$ by (III), so (I) and (II) force $x=y=0$, and the constraint would give $-z^2=1$, i.e. $z=\pm i$. That seemed odd at first, but since $(0,0,\pm i) \notin \mathbb{R}^3$, the conclusion is simply that there is no candidate point with $z \neq 0$.

The Hessian matrix: The Hessian matrix of $L$ is $$\text{Hess}_L=\begin{pmatrix} 2(1-\lambda) & 0 & 0 \\ 0 & 2(1-2\lambda) & 0 \\ 0 & 0 & 2(1 + \lambda) \end{pmatrix} $$ At both multiplier values found above it is degenerate (it has a zero eigenvalue; for $\lambda = 1$ it is even indefinite). So the usual second-derivative test gives me no information at all. I thought about arguing via compactness of the set instead, but $M$ is unbounded, so I don't manage to do so.
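Since $\text{Hess}_L$ is diagonal, its eigenvalues are just the diagonal entries, and they can be checked quickly at the two multiplier values. A small sketch of my own (again assuming SymPy):

```python
# Eigenvalues of Hess_L at the two multiplier values found above.
# The Hessian is diagonal, so the eigenvalues are its diagonal entries.
import sympy as sp

lam = sp.symbols('lambda')
diagonal = [2*(1 - lam), 2*(1 - 2*lam), 2*(1 + lam)]

for value in (sp.Integer(1), sp.Rational(1, 2)):
    eigs = [d.subs(lam, value) for d in diagonal]
    print(f"lambda = {value}: eigenvalues {eigs}")
```

For $\lambda = 1$ the eigenvalues are $0, -2, 4$ (degenerate and indefinite); for $\lambda = 1/2$ they are $1, 0, 3$ (degenerate, positive semidefinite), which is why the unconstrained test is inconclusive.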

Plot: Here is a visualization of the set $M$, produced with Mathematica.


On BEST ANSWER

You have correctly found four stationary points of $f$ on the surface $M$: they are $(\pm 1,0,0)$ and $(0,\pm 1/\sqrt{2},0)$.

That the Hessian of $L$ is indefinite does not tell you much about these stationary points. You may be thinking of the standard second-derivative test for unconstrained optimization. But you have a constrained optimization problem, for which the second-derivative test is different: it involves the bordered Hessian.

But I would not advise you to use this complicated test. If the minimum is attained, it is attained at a stationary point. So, plug each of the stationary points into $f$ and take the smallest value.

And the reason the minimum is attained has to do with compactness. Although $M$ itself is not compact, its intersection with a large closed ball $B$ (say of radius $10$, centered at the origin) is compact. So, there is a point of $M\cap B$ at minimal distance from the origin. This same point also gives minimal distance for all of $M$, because the points of $M\setminus B$ are too far: they have distance at least $10$ from the origin and thus lose the competition to, say, $(1,0,0)$, which lies at distance $1$.
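Carrying out the advice above numerically (a small sketch of my own, not code from the answer): evaluate $f$ at each of the four stationary points and keep the smallest value.

```python
# Evaluate f at the four stationary points and take the smallest value.
import math

def f(p):
    x, y, z = p
    return x**2 + y**2 + z**2

candidates = [(1, 0, 0), (-1, 0, 0),
              (0, 1/math.sqrt(2), 0), (0, -1/math.sqrt(2), 0)]
best = min(candidates, key=f)
print(best, f(best))  # minimum f is about 0.5, i.e. distance 1/sqrt(2)
```

So the minimal squared distance is $1/2$, attained at $(0,\pm 1/\sqrt{2},0)$, giving minimal distance $1/\sqrt{2}$.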