Point farthest away from ray


I have a number of points in 3D space, and I need to find the one with the largest perpendicular distance from a ray defined by two points $P_0$ and $P_1$. For each point $P$ I compute the magnitude of the cross product $d = |(P_0 - P_1)\times(P_1 - P)|$ and take the maximum, and this appears to give me the point with the largest perpendicular distance from the ray. I know that the magnitude of the cross product equals the area of the parallelogram spanned by the two vectors. So my assumption is that the bigger the area of the parallelogram, the farther away a point is from the ray. But I can't prove that my assumption is correct.
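For reference, the computation described in the question can be sketched as follows (function and variable names are illustrative, NumPy assumed; the division by $\|P_1 - P_0\|$ turns the parallelogram area into an actual distance, though it is the same constant for every candidate point):

```python
import numpy as np

def farthest_from_line(p0, p1, points):
    """Return the point with the largest perpendicular distance to the
    line through p0 and p1, together with that distance.
    Note: this measures distance to the infinite line, not the ray."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = p1 - p0  # direction vector of the line
    best, best_dist = None, -1.0
    for p in points:
        p = np.asarray(p, float)
        # |d x (p - p0)| is the area of the parallelogram spanned by
        # d and (p - p0); dividing by |d| gives the height, i.e. the
        # perpendicular distance of p from the line.
        dist = np.linalg.norm(np.cross(d, p - p0)) / np.linalg.norm(d)
        if dist > best_dist:
            best, best_dist = p, dist
    return best, best_dist
```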



On BEST ANSWER

my assumption is that the bigger the area of the parallelogram, the farther away a point is from the ray

Consider a ray starting at the point $P_0 = (0, 0)$ and going infinitely far in the direction of $P_1 = (1, 0)$. Now consider a point $P_2 = (0, 1)$ and a point $P_3 = (-10000, 1)$. We note that $\|P_3 - P_0\| > \|P_2 - P_0\|$, yet $\|(P_1 - P_0) \times (P_2 - P_0)\| = \|(P_1 - P_0) \times (P_3 - P_0)\|$. Since $P_3$ lies behind the ray's starting point, its distance to the ray is $\|P_3 - P_0\| \approx 10000$, while its distance to $P_2$'s side of things is irrelevant: the cross product measures the perpendicular distance to the infinite line through $P_0$ and $P_1$, not the distance to the ray.

A simple way to see this is to think of the parallelogram's area as base times height, $bh$: with the base $P_1 - P_0$ fixed, the area depends only on the point's height above the line, so the point can be translated arbitrarily far parallel to the line without changing the area.
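The counterexample can be checked numerically. In the sketch below (coordinates embedded in 3D so that `np.cross` returns a vector) the two parallelogram areas come out equal even though $P_3$ is far from the ray's start:

```python
import numpy as np

# Counterexample coordinates from the answer, embedded in 3D.
p0 = np.array([0.0, 0.0, 0.0])       # ray origin
p1 = np.array([1.0, 0.0, 0.0])       # ray direction passes through here
p2 = np.array([0.0, 1.0, 0.0])       # close to the ray's start
p3 = np.array([-10000.0, 1.0, 0.0])  # far behind the ray's start

d = p1 - p0
# Parallelogram areas |d x (p - p0)| for both points.
area2 = np.linalg.norm(np.cross(d, p2 - p0))
area3 = np.linalg.norm(np.cross(d, p3 - p0))
# Both areas equal 1: the cross product only sees the height above
# the infinite line, not where the point sits along it, so it cannot
# distinguish a point behind the ray's origin from one beside it.
```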