Can someone explain to me how to solve this question?
Find the shortest distance between the lines $L_1 = \left\{t \begin{bmatrix} 1\\ 1\\ 1\end{bmatrix} : t \in \mathbb{R}\right\}$ and $L_2 = \left\{s \begin{bmatrix} 1\\ 2\\ 3\end{bmatrix} + \begin{bmatrix} 1\\ 0\\ 0\end{bmatrix}: s \in \mathbb{R}\right\}$
Thanks

First take the cross product of the two direction vectors: $\langle 1,1,1\rangle \times \langle 1,2,3\rangle = \langle 1,-2,1\rangle$. Normalizing gives the unit vector $\frac{1}{\sqrt6}\langle 1,-2,1\rangle$.
The minimum distance is the length of the scalar projection, onto this unit vector, of any line segment joining $L_1$ to $L_2$. Take the point $(0,0,0)$ on $L_1$ (at $t=0$) and $(1,0,0)$ on $L_2$ (at $s=0$); they are joined by the vector $\langle 1-0,\,0-0,\,0-0\rangle = \langle 1,0,0\rangle$. The distance is then the absolute value of the dot product of $\langle 1,0,0\rangle$ with $\frac{1}{\sqrt6}\langle 1,-2,1\rangle$, which is $\frac{1}{\sqrt6}$.
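If you'd like to sanity-check the arithmetic numerically, here is a quick NumPy sketch of the same computation (variable names are mine, not from the problem):

```python
import numpy as np

# Direction vectors of L1 and L2, and one point on each line
d1 = np.array([1.0, 1.0, 1.0])
d2 = np.array([1.0, 2.0, 3.0])
p1 = np.array([0.0, 0.0, 0.0])  # on L1 (t = 0)
p2 = np.array([1.0, 0.0, 0.0])  # on L2 (s = 0)

# Common normal to both lines, then project the connecting vector onto it
n = np.cross(d1, d2)                               # [1, -2, 1]
dist = abs(np.dot(p2 - p1, n)) / np.linalg.norm(n)
print(dist)  # 1/sqrt(6) ≈ 0.4082
```

This matches the hand computation: $1/\sqrt6 \approx 0.408$.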