Is the minimum angle between a cone and a vector achieved on an extreme ray?


Suppose we have a closed convex cone $C \subset \mathbb R^d$ with apex at the origin. Let $v$ be a vector outside the cone. I am interested in measuring how far the direction of the vector is from being in the cone, in particular the value $$\theta = \min\left\{\operatorname{angle}(v,c): c \in C \setminus \{0\} \right\}, \qquad \text{equivalently} \qquad \cos\theta = \max\left \{\frac{v \cdot c}{\|v\|\|c\|}: c \in C \setminus \{0\}\right \},$$ since $\cos$ is decreasing on $[0,\pi]$, so minimising the angle is the same as maximising the cosine.

The minimum is achieved due to compactness (one can restrict to the slice $C \cap S^{d-1}$, which is closed and bounded). I suspect it is always achieved on an extreme ray of $C$. In particular, if $C$ is an intersection of half-spaces, the minimum is achieved for one of the normals to the planes.

The only idea I have to prove this is to represent the angle as a concave function minimised over some convex domain, perhaps a slice of the cone (the minimum of a concave function over a compact convex set is attained at an extreme point). However, even after changing coordinates so that $v =(1,0,\ldots,0)$ is a unit vector, the function is still nasty:

$$\frac{v \cdot c}{\|v\|\|c\|}= \frac{c_1}{\sqrt{c_1^2 + \ldots + c_d^2}}$$
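As a sanity check, this reduction can be verified numerically; a minimal sketch in Python (the vector $c=(2,1,-2)$ is an arbitrary test choice):

```python
import math

def angle(v, c):
    """Angle between vectors v and c via the normalized dot product."""
    dot = sum(vi * ci for vi, ci in zip(v, c))
    norm_v = math.sqrt(sum(vi * vi for vi in v))
    norm_c = math.sqrt(sum(ci * ci for ci in c))
    return math.acos(dot / (norm_v * norm_c))

v = (1.0, 0.0, 0.0)   # unit vector along the first coordinate axis
c = (2.0, 1.0, -2.0)  # arbitrary test vector, ||c|| = 3

lhs = math.cos(angle(v, c))                       # cosine of the actual angle
rhs = c[0] / math.sqrt(sum(ci * ci for ci in c))  # c_1 / ||c||
print(lhs, rhs)  # both equal 2/3
```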

At least for $d=2$ the Hessian is negative. I wonder whether this is obvious in higher dimensions, or whether there is an easier way to show concavity, or indeed an easier way to show that the angle is minimised on an extreme ray?

1 Answer

No. Consider $v=(0,1,-1)$ (yellow) and the two-dimensional cone with extreme rays $(1,1,0)$ (red) and $(-1,1,0)$ (blue). Both extreme rays make an angle of $60^\circ$ with $v$, while the angle is minimised at the interior cone element $(0,1,0)$ (green), where it equals $45^\circ$.

[Figure: the cone and the colour-coded directions described above.]
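The counterexample can be checked numerically. A quick sketch, using $v=(0,1,-1)$ with extreme rays $(1,1,0)$ and $(-1,1,0)$, a configuration of this shape chosen so that the interior direction $(0,1,0)$ lies in the cone:

```python
import math

def norm(x):
    """Euclidean norm of a vector given as a tuple."""
    return math.sqrt(sum(t * t for t in x))

def angle_deg(u, w):
    """Angle between vectors u and w, in degrees."""
    dot = sum(a * b for a, b in zip(u, w))
    return math.degrees(math.acos(dot / (norm(u) * norm(w))))

v = (0.0, 1.0, -1.0)
ray1 = (1.0, 1.0, 0.0)      # extreme ray of the cone
ray2 = (-1.0, 1.0, 0.0)     # extreme ray of the cone
interior = (0.0, 1.0, 0.0)  # 0.5*ray1 + 0.5*ray2, an interior direction

print(angle_deg(v, ray1))      # ~60 degrees
print(angle_deg(v, ray2))      # ~60 degrees
print(angle_deg(v, interior))  # ~45 degrees, strictly smaller than both
```

So the minimum angle is attained strictly inside the cone, not on an extreme ray.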