How could I prove the following using Lagrange optimization?
Prove that the shortest distance from the hyperplane $$H= \{\vec{x} \in \mathbb{R}^{n} : \vec{a} \cdot\vec{x}=b\} $$ to a point $\vec{x}_{1} \in \mathbb{R}^{n}$ is $$d=\frac{|\vec{a} \cdot\vec{x}_{1}-b|}{||\vec{a}||}$$.
I've tried proving it by minimizing the square of the distance function, using as a constraint the fact that the point lies on the hyperplane, but the algebra is killing me.
Thanks!
First, let's do it in the special case $\vec{x}_{1}=\vec{0}$.
$d\left(\vec{0},H\right)=\left\Vert \vec{p}\right\Vert$, where $\vec{p}$ denotes the orthogonal projection of $\vec{0}$ onto $H$.
Then firstly $\vec{a}\cdot\vec{p}=b$ (i.e. $\vec{p}\in H$), and secondly $\vec{p}=\lambda\vec{a}$ for some scalar $\lambda$, since the segment from $\vec{0}$ to its projection is orthogonal to $H$ and hence parallel to $\vec{a}$.
Substituting the second relation into the first gives $\lambda\left\Vert \vec{a}\right\Vert ^{2}=b$, so $\lambda=b\left\Vert \vec{a}\right\Vert ^{-2}$ and $d\left(\vec{0},H\right)=\left\Vert \vec{p}\right\Vert =\left|\lambda\right|\left\Vert \vec{a}\right\Vert =\left|b\right|\left\Vert \vec{a}\right\Vert ^{-1}$.
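If it helps, the special case can be sanity-checked numerically. Here is a small sketch using NumPy; the particular $\vec{a}$ and $b$ are arbitrary values of my own choosing, not from the problem:

```python
import numpy as np

# Arbitrary example hyperplane a·x = b (illustrative values only)
a = np.array([3.0, -1.0, 2.0])
b = 5.0

# Projection of the origin onto H: p = lambda * a with lambda = b / ||a||^2
lam = b / np.dot(a, a)
p = lam * a

# p lies on H ...
assert np.isclose(np.dot(a, p), b)
# ... and its norm matches the claimed distance |b| / ||a||
assert np.isclose(np.linalg.norm(p), abs(b) / np.linalg.norm(a))

# No other point of H is closer: test against random points projected onto H
rng = np.random.default_rng(0)
for _ in range(100):
    x = rng.normal(size=3)
    x = x - (np.dot(a, x) - b) / np.dot(a, a) * a  # force x onto H
    assert np.linalg.norm(x) >= np.linalg.norm(p) - 1e-9
```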
To generalize, apply a translation (which is an isometry): $$d\left(\vec{x}_{1},H\right)=d\left(\vec{0},-\vec{x}_{1}+H\right)=d\left(\vec{0},H'\right)$$ where $H'=\left\{ \vec{x}\mid\vec{a}\cdot\vec{x}=b'\right\}$ with $b'=b-\vec{a}\cdot\vec{x}_{1}$. This follows from:
$$-\vec{x}_{1}+H=\left\{ \vec{x}-\vec{x}_{1}\mid\vec{a}\cdot\vec{x}=b\right\} =\left\{ \vec{x}\mid\vec{a}\cdot\left(\vec{x}+\vec{x}_{1}\right)=b\right\} =\left\{ \vec{x}\mid\vec{a}\cdot\vec{x}=b-\vec{a}\cdot\vec{x}_{1}\right\} =\left\{ \vec{x}\mid\vec{a}\cdot\vec{x}=b'\right\} $$
Applying the special case to $H'$ then gives $$d\left(\vec{x}_{1},H\right)=\left|b'\right|\left\Vert \vec{a}\right\Vert ^{-1}=\frac{\left|\vec{a}\cdot\vec{x}_{1}-b\right|}{\left\Vert \vec{a}\right\Vert }.$$
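Since you asked specifically about Lagrange multipliers, here is a sketch of that route (minimizing the squared distance, as you attempted); the algebra stays manageable if you first solve the stationarity condition for $\vec{x}$ in terms of $\lambda$:

$$\text{minimize } f(\vec{x})=\left\Vert \vec{x}-\vec{x}_{1}\right\Vert ^{2}\quad\text{subject to}\quad g(\vec{x})=\vec{a}\cdot\vec{x}-b=0.$$

The condition $\nabla f=\lambda\nabla g$ reads $2\left(\vec{x}-\vec{x}_{1}\right)=\lambda\vec{a}$, so $\vec{x}=\vec{x}_{1}+\tfrac{\lambda}{2}\vec{a}$. Substituting into the constraint:

$$\vec{a}\cdot\vec{x}_{1}+\tfrac{\lambda}{2}\left\Vert \vec{a}\right\Vert ^{2}=b\quad\Longrightarrow\quad\tfrac{\lambda}{2}=\frac{b-\vec{a}\cdot\vec{x}_{1}}{\left\Vert \vec{a}\right\Vert ^{2}},$$

and therefore

$$d=\left\Vert \vec{x}-\vec{x}_{1}\right\Vert =\left|\tfrac{\lambda}{2}\right|\left\Vert \vec{a}\right\Vert =\frac{\left|\vec{a}\cdot\vec{x}_{1}-b\right|}{\left\Vert \vec{a}\right\Vert }.$$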