I have two skew lines in $\mathbb{R}^N$ ($N > 2$) defined as $\vec{x} = \vec{x}_A + \vec{d}_A t$ and $\vec{x} = \vec{x}_B + \vec{d}_B s$ ($t, s \in \mathbb{R}$). Now, I'd like to calculate the shortest distance between those lines. In 3D, this seems to be rather simple since the cross product $[\vec{d}_A \times \vec{d}_B]$ is a vector. However, in $\mathbb{R}^N$, there are infinitely many vectors that are perpendicular to $\vec{d}_A$ and $\vec{d}_B$; they lie in a subspace $H^{\perp}$ of dimension $N - 2$.
My question is: How can one calculate the minimal distance without generalizing the cross product to $N$ dimensions?
You could compute the minimum of
$$ d(s,t)=\Vert(\vec x_A+\vec d_At)-(\vec x_B+\vec d_Bs)\Vert=\Vert(\vec x_A-\vec x_B)+\vec d_At-\vec d_Bs\Vert $$
using basic calculus. In more detail: the above gives you a function $\mathbb R^2\rightarrow \mathbb R$. Compute its gradient and look for its zeros.
Hint: It is even easier to minimize $d(s,t)^2$, which is a quadratic function of $(s,t)$, so setting its gradient to zero reduces to solving a $2\times 2$ linear system.
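As a sketch of this approach (function and variable names are my own, not from the question): setting the gradient of $d(s,t)^2 = \Vert\vec w + \vec d_A t - \vec d_B s\Vert^2$ with $\vec w = \vec x_A - \vec x_B$ to zero gives the two equations $\vec d_A\cdot(\vec w + \vec d_A t - \vec d_B s) = 0$ and $\vec d_B\cdot(\vec w + \vec d_A t - \vec d_B s) = 0$, which one can solve numerically:

```python
import numpy as np

def skew_line_distance(xA, dA, xB, dB):
    """Minimal distance between the lines xA + t*dA and xB + s*dB in R^N.

    Solves the 2x2 linear system obtained by setting the gradient of
    d(s,t)^2 to zero; assumes the lines are not parallel, so the
    system has a unique solution.
    """
    xA, dA = np.asarray(xA, float), np.asarray(dA, float)
    xB, dB = np.asarray(xB, float), np.asarray(dB, float)
    w = xA - xB
    # Gradient-zero conditions in matrix form, unknowns (t, s):
    #   (dA.dA) t - (dA.dB) s = -dA.w
    #   (dA.dB) t - (dB.dB) s = -dB.w
    M = np.array([[dA @ dA, -(dA @ dB)],
                  [dA @ dB, -(dB @ dB)]])
    rhs = np.array([-(dA @ w), -(dB @ w)])
    t, s = np.linalg.solve(M, rhs)
    return np.linalg.norm(w + t * dA - s * dB)
```

For example, in $\mathbb{R}^4$ the line along the first axis and the line through $(0,0,1,1)$ along the second axis are skew, and the sketch above returns their distance $\sqrt 2$.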