I have the following 3D coordinates that mark the start and end points of a line:
$\left(\begin{array}{c}1\\ 1\\1\end{array}\right)$ $\left(\begin{array}{c}-1\\ -1\\1\end{array}\right)$
The point from where I need to find the distance is given by:
$\left(\begin{array}{c}\sqrt{1/8}\\ \sqrt{1/8}\\\sqrt{3/4}\end{array}\right)$
Is it possible to do this using vector algebra? I am writing a computer program where this needs to be implemented; what is the simplest formula for it?
Thanks and regards.
You can express any point on the line as $$v = a + \lambda (b-a),$$ where $a$ and $b$ are the two given endpoints and $\lambda$ is a scalar parameter. Let $p$ denote the given point.
Now, the point on the line closest to $p$ is the foot of the perpendicular from $p$ to the line. Hence, we find the $\lambda$ for which $$(v-p)\cdot(b-a)=0$$ $$\implies ((a-p)+\lambda(b-a))\cdot(b-a)=0$$ $$\implies \lambda = \frac{(p-a)\cdot(b-a)}{|b-a|^2}.$$
Hence we can obtain $v$, and the distance is simply $$d=|v-p|.$$
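As a check, applying this to the points in the question: $b-a = (-2,-2,0)^T$ and $p-a = \left(\sqrt{1/8}-1,\ \sqrt{1/8}-1,\ \sqrt{3/4}-1\right)^T$, so $$\lambda = \frac{-4\left(\sqrt{1/8}-1\right)}{8} = \frac{4-\sqrt{2}}{8} \approx 0.3232,$$ which gives $v = \left(\sqrt{1/8},\ \sqrt{1/8},\ 1\right)^T$ and $$d = |v-p| = 1-\sqrt{3/4} \approx 0.134.$$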
This should be easy enough to implement in code.
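For instance, a minimal sketch in Python, assuming the line is treated as infinite (the function name `point_line_distance` is just for illustration):

```python
import math

def point_line_distance(a, b, p):
    """Distance from point p to the line through a and b (3D tuples)."""
    d = [bi - ai for ai, bi in zip(a, b)]      # direction vector b - a
    w = [pi - ai for ai, pi in zip(a, p)]      # vector from a to p
    # lambda = (p - a).(b - a) / |b - a|^2
    lam = sum(wi * di for wi, di in zip(w, d)) / sum(di * di for di in d)
    # Foot of the perpendicular: v = a + lambda * (b - a)
    v = [ai + lam * di for ai, di in zip(a, d)]
    return math.sqrt(sum((vi - pi) ** 2 for vi, pi in zip(v, p)))

a = (1.0, 1.0, 1.0)
b = (-1.0, -1.0, 1.0)
p = (math.sqrt(1/8), math.sqrt(1/8), math.sqrt(3/4))
print(point_line_distance(a, b, p))   # ~0.13397, i.e. 1 - sqrt(3)/2
```

One design note: if $a$ and $b$ bound a segment rather than an infinite line, clamp $\lambda$ to $[0,1]$ before computing $v$, so the closest point never falls outside the segment.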