Prove that there exists a shortest distance between two skew lines


Every geometry textbook states that the shortest distance between two skew lines (lines that are not coplanar) is realized along the unique line that runs perpendicular to both. This is fairly simple to prove, given that a shortest segment exists (see my comments below). However, how can we prove that between any two skew lines there exists a shortest straight path?

I have tried using calculus to show that, for the lines $L_1 = \mathbf{p}+s\mathbf{u}$ and $L_2 = \mathbf{q} + t\mathbf{v}$, the function

$$R(s,t) = \Vert \mathbf{p}+s\mathbf{u}-(\mathbf{q}+t\mathbf{v})\Vert^2$$

has a local minimum. However, I am not quite able to show this without getting into pages and pages of calculations, and what is more, even after I have shown it, that only proves a *local* minimum exists.
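For a concrete sanity check, here is a minimal numeric sketch (assuming numpy; the lines $L_1, L_2$ below are example data of my own, not from the question) that evaluates $R$ on a grid and shows the minimum sitting in the interior of the window rather than at its edge:

```python
import numpy as np

# Example skew lines (data of my own): L1 = p + s*u, L2 = q + t*v
p, u = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
q, v = np.array([0.0, 1.0, 1.0]), np.array([0.0, 0.0, 1.0])

# Evaluate R(s, t) = |L1(s) - L2(t)|^2 on a coarse grid.
ss, tt = np.meshgrid(np.linspace(-10, 10, 401), np.linspace(-10, 10, 401))
diffs = (p + ss[..., None] * u) - (q + tt[..., None] * v)  # shape (401, 401, 3)
R = np.einsum('ijk,ijk->ij', diffs, diffs)

# R blows up as |s|, |t| grow, so the minimum sits in the interior:
i, j = np.unravel_index(R.argmin(), R.shape)
print("min R ~", R[i, j], "at s ~", ss[i, j], ", t ~", tt[i, j])
# for this data: min R = 1.0 at s = 0, t = -1, i.e. distance 1
```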


There are 3 solutions below.

---

It can be done. We have $$\frac{\partial R}{\partial s} (s,t)= 2(\mathbf{p}-\mathbf{q}+s\mathbf{u}-t\mathbf{v})\cdot \mathbf{u}, \quad \frac{\partial R}{\partial t} (s,t)= -2(\mathbf{p}-\mathbf{q}+s\mathbf{u}-t\mathbf{v})\cdot \mathbf{v}.$$

The condition for a local extremum is $$\frac{\partial R}{\partial s} (s_0,t_0) = \frac{\partial R}{\partial t} (s_0,t_0)=0,$$ so such $s_0,t_0$ satisfy $\mathbf{p}-\mathbf{q}+s_0\mathbf{u}-t_0\mathbf{v} \perp \mathbf{u},\mathbf{v}$. Therefore, there is a scalar $\alpha$ such that $$\mathbf{p}-\mathbf{q}+s_0\mathbf{u}-t_0\mathbf{v} = \alpha(\mathbf{u} \times \mathbf{v}).$$ Note that $\mathbf{u}$ and $\mathbf{v}$ are linearly independent (since the lines are skew), so $\{\mathbf{u}, \mathbf{v}, \mathbf{u} \times \mathbf{v}\}$ is a basis for $\Bbb{R}^3$ and hence $s_0, t_0, \alpha$ exist and are unique. Moreover, the Hessian of $R$ is the constant matrix $$2\begin{pmatrix} \|\mathbf{u}\|^2 & -\mathbf{u}\cdot\mathbf{v} \\ -\mathbf{u}\cdot\mathbf{v} & \|\mathbf{v}\|^2 \end{pmatrix},$$ which is positive definite: its determinant is positive by the strict Cauchy-Schwarz inequality, since $\mathbf{u}, \mathbf{v}$ are independent. Hence $R$ is a strictly convex quadratic, and its unique critical point is the global minimum.
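As a sanity check on this argument, here is a small sketch (assuming numpy, with example vectors of my own choosing) that solves the two equations $\partial R/\partial s = \partial R/\partial t = 0$ as a $2\times 2$ linear system and confirms the constant Hessian is positive definite:

```python
import numpy as np

# Example skew lines (data of my own): L1 = p + s*u, L2 = q + t*v
p, u = np.array([1.0, 2.0, 0.0]), np.array([1.0, 1.0, 0.0])
q, v = np.array([0.0, 0.0, 3.0]), np.array([0.0, 1.0, 1.0])
w = p - q

# Critical point: w + s*u - t*v must be orthogonal to both u and v,
# i.e.  [u.u  -u.v] [s]   [-w.u]
#       [u.v  -v.v] [t] = [-w.v]
M = np.array([[u @ u, -(u @ v)],
              [u @ v, -(v @ v)]])
s0, t0 = np.linalg.solve(M, [-(w @ u), -(w @ v)])

# The Hessian of R is constant; positive definiteness (via Cauchy-Schwarz,
# u and v independent) makes the critical point the global minimum.
H = 2 * np.array([[u @ u, -(u @ v)],
                  [-(u @ v), v @ v]])
assert np.all(np.linalg.eigvalsh(H) > 0)

print("s0, t0 =", s0, t0)
print("distance =", np.linalg.norm(w + s0 * u - t0 * v))  # 4/sqrt(3) here
```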

Taking the scalar product of the above relation with $\mathbf{u} \times \mathbf{v}$, we get $$(\mathbf{p}-\mathbf{q})\cdot (\mathbf{u} \times \mathbf{v}) = (\mathbf{p}-\mathbf{q}+s_0\mathbf{u}-t_0\mathbf{v}) \cdot (\mathbf{u} \times \mathbf{v})= \alpha \|\mathbf{u} \times \mathbf{v}\|^2,$$ so $$\alpha = \frac{(\mathbf{p}-\mathbf{q})\cdot (\mathbf{u} \times \mathbf{v})}{\|\mathbf{u} \times \mathbf{v}\|^2}.$$ The minimal distance is now given by $$\|\mathbf{p}-\mathbf{q}+s_0\mathbf{u}-t_0\mathbf{v}\| = |\alpha|\, \|\mathbf{u} \times \mathbf{v}\| = \frac{|(\mathbf{p}-\mathbf{q})\cdot (\mathbf{u} \times \mathbf{v})|}{\|\mathbf{u} \times \mathbf{v}\|}.$$
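The closed-form distance is easy to compute directly; a minimal sketch, assuming numpy and reusing the example data from the previous snippet:

```python
import numpy as np

# Same example data as in the previous snippet
p, u = np.array([1.0, 2.0, 0.0]), np.array([1.0, 1.0, 0.0])
q, v = np.array([0.0, 0.0, 3.0]), np.array([0.0, 1.0, 1.0])

def skew_distance(p, u, q, v):
    """Distance between skew lines: |(p - q) . (u x v)| / |u x v|."""
    n = np.cross(u, v)            # direction of the common perpendicular
    return abs((p - q) @ n) / np.linalg.norm(n)

print(skew_distance(p, u, q, v))  # 4/sqrt(3) ~ 2.3094 for this data
```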

---

Here is a quick proof, making heavy use of matrix analysis. Let $A$ denote the matrix whose columns are $\mathbf u$ and $-\mathbf v$, let $\mathbf x$ denote the column vector $\mathbf x = (s,t)$, and let $\mathbf b = \mathbf q - \mathbf p$.

The function that we are trying to minimize is $$ R(\mathbf x) = \left\|A \mathbf x - \mathbf b \right\|^2. $$ In other words, we are looking for the least-squares solution to the equation $A\mathbf x = \mathbf b$. There are many derivations/justifications of the solution $\mathbf x = (A^TA)^{-1}A^T\mathbf b$ (e.g. via the normal equations $A^TA\mathbf x = A^T\mathbf b$); note that $A^TA$ is invertible here precisely because the columns of $A$ are linearly independent, the lines being skew.
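Here is a minimal sketch of that least-squares reading (assuming numpy, with example data of my own; `np.linalg.lstsq` returns, among other things, the sum of squared residuals, which is exactly the minimal value of $R$):

```python
import numpy as np

# Example skew lines (data of my own): L1 = p + s*u, L2 = q + t*v
p, u = np.array([1.0, 2.0, 0.0]), np.array([1.0, 1.0, 0.0])
q, v = np.array([0.0, 0.0, 3.0]), np.array([0.0, 1.0, 1.0])

A = np.column_stack([u, -v])   # columns u and -v, as in the text
b = q - p

# Least-squares solution of A x = b; since A has independent columns
# (the lines are skew), this equals (A^T A)^{-1} A^T b.
x, residual, rank, _ = np.linalg.lstsq(A, b, rcond=None)
s0, t0 = x
print("s0, t0 =", s0, t0)
print("distance =", np.sqrt(residual[0]))   # minimal |A x - b|
```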


Here is another proof. I claim (without proof) that because distances are invariant under rotation and translation, we can assume without loss of generality that $\mathbf q = 0$ and $\mathbf v = (0,0,1)$. With that, we find $$ (s\mathbf u + \mathbf p) - (t \mathbf v + \mathbf q) = (su_1 + p_1,\ su_2 + p_2,\ su_3 - t + p_3). $$ With the substitution $k = su_3 - t + p_3$, this is simply the vector $$ (su_1 + p_1, su_2 + p_2, k). $$ Of course, we can rearrange $$ k = u_3\,s - t + p_3 \implies t = u_3\,s - k + p_3. $$ In other words, the change of coordinates $(s,t) \mapsto (s,k)$ is bijective. So, minimizing $R(s,t)$ is equivalent to minimizing $R(s,k)$.

Now, it is easy to see that $R(s,k)$ attains a minimum, since $$ R(s,k) = \|(su_1 + p_1, su_2 + p_2, k)\|^2 = (su_1 + p_1)^2 + (s u_2 + p_2)^2 + k^2, $$ which means that $R$ is minimized at $s = s_0, k=0,$ where $s_0$ is the value of $s$ that minimizes $(su_1 + p_1)^2 + (s u_2 + p_2)^2$.
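For completeness, that last one-variable minimization can be made explicit (a standard calculus step, added here for the record): setting the derivative of $(su_1+p_1)^2 + (su_2+p_2)^2$ to zero gives $$ 2u_1(su_1+p_1) + 2u_2(su_2+p_2) = 0 \implies s_0 = -\frac{u_1p_1 + u_2p_2}{u_1^2 + u_2^2}, $$ which is well-defined because $u_1 = u_2 = 0$ would make $\mathbf u$ parallel to $\mathbf v = (0,0,1)$, contradicting skewness.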


Here is a proof by "completing the square." Expand the inner product $$ \| s \mathbf u - t\mathbf v + (\mathbf p- \mathbf q) \|^2 = ( s \mathbf u - t\mathbf v + (\mathbf p- \mathbf q))\cdot ( s \mathbf u - t\mathbf v + (\mathbf p- \mathbf q)) \\ = s^2 \| \mathbf u\|^2 - 2st (\mathbf u \cdot \mathbf v) + t^2 \|\mathbf v\|^2 + 2s\, \mathbf u \cdot (\mathbf p - \mathbf q) - 2t\,\mathbf v \cdot (\mathbf p - \mathbf q) + \|\mathbf p - \mathbf q\|^2. $$ The constant term plays no role, which is to say that it suffices to minimize the function $$ (s,t) \mapsto s^2 \| \mathbf u\|^2 - 2st (\mathbf u \cdot \mathbf v) + t^2 \|\mathbf v\|^2 + 2s [\mathbf u \cdot (\mathbf p - \mathbf q)] - 2t[\mathbf v \cdot (\mathbf p - \mathbf q)]. $$ To simplify things, rewrite our function as $$ R(s,t) = s^2 \| \mathbf u\|^2 - 2st (\mathbf u \cdot \mathbf v) + t^2 \|\mathbf v\|^2 + cs + dt + C, $$ where $C$ is some constant and we simply note that $c = 2\,\mathbf u \cdot (\mathbf p - \mathbf q)$ and $d = -2\,\mathbf v \cdot (\mathbf p - \mathbf q)$ are real numbers.

Take out a perfect square $\left(\|\mathbf u\|s - \frac{\mathbf u\cdot \mathbf v}{\|\mathbf u\|}t\right)^2$ to get $$ R(s,t) = \left(\|\mathbf u\|s - \frac{\mathbf u\cdot \mathbf v}{\|\mathbf u\|}t\right)^2 + \left(\|\mathbf v\|^2 - \frac{(\mathbf u \cdot \mathbf v)^2}{\|\mathbf u\|^2}\right)t^2 + cs + dt + C. $$ Importantly, we note that $\|\mathbf v\|^2 - \frac{(\mathbf u \cdot \mathbf v)^2}{\|\mathbf u\|^2} > 0$ as a consequence of the Cauchy-Schwarz inequality, i.e. that for non-parallel $\mathbf u,\mathbf v$ we have $$ |\mathbf u \cdot \mathbf v| = \|\mathbf u\|\,\|\mathbf v\| \cdot |\cos \theta| < \|\mathbf u\| \,\|\mathbf v\|. $$ Thus, we have written $R(s,t)$ in the form $$ R(s,t) = a (s - kt)^2 + bt^2 + cs + dt + C, $$ with $a,b > 0$ and $c,d,k \in \Bbb R$. Noting that $cs = c(s - kt) + ckt$, we can absorb the extra $ckt$ into the linear term in $t$: with $\bar d = ck + d$, $$ R(s,t) = a (s - kt)^2 + bt^2 + c(s-kt) + \bar d t + C = [a (s - kt)^2 + c(s - kt)] + [bt^2 + \bar d t] + C. $$ With that, it suffices to note that $(s,t) \mapsto (s - kt,\, t)$ is a bijection of $\Bbb R^2$, and that the functions $$ f(x) = ax^2 + cx, \quad g(x) = bx^2 + \bar d x $$ both attain their minima, each being an upward-opening parabola.
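Since the rearrangement with $\bar d$ is easy to fumble, here is a quick symbolic check (a sketch assuming sympy; the symbols mirror the text above):

```python
import sympy as sp

s, t, a, b, c, d, k = sp.symbols('s t a b c d k', real=True)

# The completed-square rearrangement from the text, with dbar = c*k + d:
R_before = a*(s - k*t)**2 + b*t**2 + c*s + d*t
R_after = (a*(s - k*t)**2 + c*(s - k*t)) + (b*t**2 + (c*k + d)*t)

# The two expressions agree identically in s and t:
assert sp.expand(R_before - R_after) == 0
print("completed-square identity verified")
```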


Here is a proof along the lines of a typical "real analysis" inequality. Note that $\inf_{s,t \in \Bbb R} R(s,t)$ refers to the greatest lower bound ("infimum") of $R(s,t)$ over all real $s,t$. This lower bound must exist because $R(s,t)$ is always non-negative.

First, note that we necessarily have $$ \inf_{s,t \in \Bbb R} R(s,t) \leq R(0,0) = |\mathbf p - \mathbf q|^2. $$ We also note that there is a shortest distance between a point and a line. Because the lines are skew, $\mathbf u$ and $\mathbf v$ are not parallel, i.e. neither is a scalar multiple of the other. Thus, there exist $m_1,m_2 > 0$ such that for all $t \in \Bbb R$, $|\mathbf u - t \mathbf v| \geq m_1$ and $|\mathbf v - t \mathbf u| \geq m_2$.

Now note that for $|s| > c_1 = 2|\mathbf p - \mathbf q|/m_1$, we have $$ R(s,t) = |\mathbf p + s \mathbf u - \mathbf q - t\mathbf v|^2 \geq (|s\mathbf u - t \mathbf v| - |\mathbf p - \mathbf q|)^2\\ = (|s|\, | \mathbf u - (t/s) \mathbf v | - |\mathbf p - \mathbf q| )^2\\ \geq (|s|\,m_1 - |\mathbf p - \mathbf q| )^2\\ > (2|\mathbf p - \mathbf q| - |\mathbf p - \mathbf q|)^2 = |\mathbf p - \mathbf q|^2. $$ Similarly, if $|t| > c_2 = 2|\mathbf p - \mathbf q|/m_2$, then $R(s,t) > |\mathbf p - \mathbf q|^2$.

It follows that $$ \inf_{s,t \in \Bbb R} R(s,t) = \inf_{|s|\leq c_1,|t|\leq c_2} R(s,t). $$ In other words, it suffices to consider $R(s,t)$ over the closed and bounded set $[-c_1,c_1]\times[-c_2,c_2]$. However, by the extreme value theorem, any continuous real-valued function on a compact domain attains its maximum and minimum. So, the infimum over $[-c_1,c_1]\times[-c_2,c_2]$ (which is necessarily the infimum over $\Bbb R \times \Bbb R$) is attained.
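To see the argument in action, here is a sketch (assuming numpy, with example data of my own) that computes the bounds $c_1, c_2$ from the proof and compares the minimum over the resulting compact box with the cross-product formula from the first answer:

```python
import numpy as np

# Example skew lines (data of my own): L1 = p + s*u, L2 = q + t*v
p, u = np.array([1.0, 2.0, 0.0]), np.array([1.0, 1.0, 0.0])
q, v = np.array([0.0, 0.0, 3.0]), np.array([0.0, 1.0, 1.0])
w = p - q

# m1 = min_t |u - t*v| is the distance from u to the line span(v);
# the minimizing t is the orthogonal-projection coefficient. Same for m2.
m1 = np.linalg.norm(u - ((u @ v) / (v @ v)) * v)
m2 = np.linalg.norm(v - ((v @ u) / (u @ u)) * u)
c1 = 2 * np.linalg.norm(w) / m1
c2 = 2 * np.linalg.norm(w) / m2

# Grid-search R over the compact box [-c1, c1] x [-c2, c2].
S, T = np.meshgrid(np.linspace(-c1, c1, 801), np.linspace(-c2, c2, 801))
D = (p[:, None, None] + S * u[:, None, None]) - (q[:, None, None] + T * v[:, None, None])
box_min = np.sqrt((D ** 2).sum(axis=0).min())

# Compare with the cross-product formula from the first answer.
n = np.cross(u, v)
print("box min ~", box_min, " exact:", abs(w @ n) / np.linalg.norm(n))
```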

---

If you have two skew lines $a$ and $b$ it is easy to construct a line perpendicular to both, hence proving that it exists.

  1. Construct the plane $\beta$ containing $b$ and parallel to $a$.

  2. Construct the plane $\alpha$ containing $a$ and perpendicular to $\beta$.

  3. If $B$ is the intersection of $\alpha$ with $b$, drop from $B$ the perpendicular to $a$ (inside $\alpha$), meeting $a$ at the point $A$. The line $AB$ is then perpendicular to both $a$ and $b$ and is thus the solution.

*(Figure: construction of the common perpendicular $AB$ to the skew lines $a$ and $b$.)*

It is then immediate to show that $AB$ realizes the minimum distance: given any two points $P\in a$ and $Q\in b$, let $H$ be the projection of $P$ on $\beta$. Since $a$ is parallel to $\beta$, every point of $a$ lies at the same distance from $\beta$, so $PH = AB$, and we have:

$$ PQ^2=PH^2+HQ^2\ge PH^2=AB^2. $$
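For the skeptical reader, a small numeric check of this inequality (a sketch assuming numpy; the lines $a, b$ are example data of my own): since $a \parallel \beta$, the distance from any $P \in a$ to $\beta$ is constant and equals $AB$, so $PQ \ge AB$ on random samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Example skew lines (data of my own): a = p + s*u, b = q + t*v
p, u = np.array([1.0, 2.0, 0.0]), np.array([1.0, 1.0, 0.0])
q, v = np.array([0.0, 0.0, 3.0]), np.array([0.0, 1.0, 1.0])

# The plane beta contains b and is parallel to a, so its unit normal
# points along u x v, and AB is the distance from a to beta.
n = np.cross(u, v)
n = n / np.linalg.norm(n)
AB = abs((p - q) @ n)

for _ in range(5):
    s, t = rng.uniform(-10, 10, size=2)
    P, Q = p + s * u, q + t * v
    PH = abs((P - q) @ n)        # distance from P to beta
    assert np.isclose(PH, AB)    # constant, since a is parallel to beta
    assert np.linalg.norm(P - Q) >= AB - 1e-9
print("PQ >= AB on all samples; AB =", AB)
```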