I am not asking anyone to do this for me. This question comes out of the blue; the ones before and after it are trivial in comparison. I need hints:
If $\vec{p}$ is a fixed point and $\vec{x}(t) = \vec{a}t+\vec{b}$ is a line, then show that the distance between the point $\vec{p}$ and the line satisfies
$$\left( (\vec{p}-\vec{b})\cdot \vec{a}\right)^2 \left( 1 - \frac{2}{\|a\|} \right)^2 + \left\|\vec{p}-\vec{b}\right\|^2 = \left\| \vec{a}\times (\vec{p}-\vec{b})\right\|^2$$
The expression on the right-hand side is intuitively straightforward, but I cannot summon any intuition, or even a method, for the left-hand side.
If this weren't such a formidable-looking, purely vector expression, I would have proceeded to find the minimum of $\left\|\vec{x}(t)-\vec{p}\right\|$. But I don't think that would work here, since this is a purely vector equation.
Update: I did it and got the expression
$$\sqrt{ \left\|\vec{p}-\vec{b}\right\|^2 - \left(\frac{(\vec{p}-\vec{b})\cdot \vec{a}}{\|\vec{a}\|}\right)^2} = \frac{\left\|\vec{a}\times (\vec{p}-\vec{b})\right\|}{\|\vec{a}\|}$$
I am pretty sure this is the correct version of the expression quoted above.
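This version can be sanity-checked numerically; a minimal sketch using NumPy, with arbitrarily chosen example vectors:

```python
import numpy as np

# arbitrary example vectors (any choice with a not parallel to p - b works)
a = np.array([1.0, 2.0, 2.0])   # direction of the line
b = np.array([0.0, 1.0, -1.0])  # a point on the line
p = np.array([3.0, -2.0, 4.0])  # the fixed point

w = p - b
lhs = np.sqrt(np.dot(w, w) - (np.dot(w, a) / np.linalg.norm(a))**2)
rhs = np.linalg.norm(np.cross(a, w)) / np.linalg.norm(a)
```

The two sides agree to floating-point precision, which is exactly Lagrange's identity $\|\vec{a}\times\vec{w}\|^2 = \|\vec{a}\|^2\|\vec{w}\|^2 - (\vec{a}\cdot\vec{w})^2$ divided by $\|\vec{a}\|^2$.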
For the expression on the right-hand side to be correct (as the square of the distance), you have to assume that $\vec{a}$ is a unit vector. Then, as Rahul wrote, your left-hand side becomes simply
$$ [(\vec{p}-\vec{b})\cdot\vec{a}]^2 + \|\vec{p}-\vec{b}\|^2 $$
This expression is still incorrect, but it can easily be fixed by inserting a minus sign:
$$ -[(\vec{p}-\vec{b})\cdot\vec{a}]^2 + \|\vec{p}-\vec{b}\|^2 $$
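With $\vec{a}$ a unit vector, this corrected left-hand side does equal $\|\vec{a}\times(\vec{p}-\vec{b})\|^2$; a quick numerical sketch (assuming NumPy, with random example vectors):

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.normal(size=3)
a /= np.linalg.norm(a)          # the formula assumes a unit direction vector
b, p = rng.normal(size=3), rng.normal(size=3)

w = p - b
lhs = -np.dot(w, a)**2 + np.dot(w, w)           # corrected left-hand side
rhs = np.dot(np.cross(a, w), np.cross(a, w))    # ||a x (p - b)||^2
```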
That the left- and right-hand sides are equal is just the following statement: given an orthonormal basis $\vec{i}, \vec{j}, \vec{k}$, any vector can be written as
$$ \vec{w} = w_i\vec{i} + w_j\vec{j} + w_k\vec{k} $$
The norm satisfies
$$ \|\vec{w} \|^2 = w_i^2 + w_j^2 + w_k^2 $$
and note that $w_i = \vec{w}\cdot \vec{i}$. Now, consider $\vec{w}\times \vec{i} = - w_j \vec{k} + w_k \vec{j}$. So you have that
$$\|\vec{w}\times \vec{i}\|^2 = w_j^2 + w_k^2 $$
So
$$ \|\vec{w}\|^2 = (\vec{w}\cdot \vec{i})^2 + \|\vec{w}\times \vec{i}\|^2 $$
holds for any unit vector $\vec{i}$.
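A minimal numerical check of this identity (a sketch assuming NumPy; the vectors are random examples, and any unit $\vec{i}$ works):

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=3)          # an arbitrary vector
i = rng.normal(size=3)
i /= np.linalg.norm(i)          # normalize to get a unit vector

# ||w||^2 = (w . i)^2 + ||w x i||^2 for any unit vector i
lhs = np.dot(w, w)
rhs = np.dot(w, i)**2 + np.dot(np.cross(w, i), np.cross(w, i))
```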
Geometrically this idea is simple: given the vector $\vec{w}$ and a unit vector $\vec{a}$, you can always decompose $\vec{w}$ into a component parallel to $\vec{a}$ and one orthogonal to it. The squared norm of $\vec{w}$ then equals the sum of the squared norms of the two components, by the Pythagorean theorem.
To actually derive this statement using multivariable calculus, I would recommend following Didier's hint and minimizing the square of the distance instead.
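That calculus route is short: setting $\frac{d}{dt}\|\vec{a}t+\vec{b}-\vec{p}\|^2 = 2\vec{a}\cdot(\vec{a}t+\vec{b}-\vec{p}) = 0$ gives $t^* = \vec{a}\cdot(\vec{p}-\vec{b})/\|\vec{a}\|^2$. A sketch checking that the distance at $t^*$ matches the cross-product formula, using arbitrary example vectors:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])   # direction of the line
b = np.array([0.0, 1.0, -1.0])  # a point on the line
p = np.array([3.0, -2.0, 4.0])  # the fixed point

# minimizer of ||a t + b - p||^2:  t* = a.(p - b) / ||a||^2
t_star = np.dot(a, p - b) / np.dot(a, a)
dist_min = np.linalg.norm(a * t_star + b - p)

# distance via the cross-product formula
dist_cross = np.linalg.norm(np.cross(a, p - b)) / np.linalg.norm(a)
```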