How to find the normal vector of a hyperplane in $n$-dimensional space?


Let $p$ be a hyperplane in $n$-dimensional space. (So $p$ is the set of points $\mathbf r$ satisfying $\mathbf r \cdot \mathbf n = d$ for some $d \in \mathbb R$.)

In the case where $n=3$, we can pick any two non-parallel vectors $\mathbf u$ and $\mathbf v$ on $p$ and we have $\mathbf n = k(\mathbf u \times \mathbf v)$ (for some $k\neq 0$).
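As a quick numerical check of the $n=3$ case (a sketch using NumPy; the two vectors are arbitrary examples):

```python
import numpy as np

# two non-parallel vectors lying on the plane (arbitrary examples)
u = np.array([1.0, 0.0, 2.0])
v = np.array([0.0, 1.0, -1.0])

# their cross product is normal to the plane
n = np.cross(u, v)

# n is orthogonal to both u and v
assert np.isclose(n @ u, 0.0)
assert np.isclose(n @ v, 0.0)
```

Any nonzero scalar multiple of `n` is an equally valid normal, which is why the answer is only determined up to the factor $k$.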

But in higher dimensions ($n>3$), given linearly independent vectors $\mathbf v_1$, $\mathbf v_2$, $\dots$, $\mathbf v_{n-1}$ on $p$, how might we similarly find $\mathbf n$?


BEST ANSWER

Write a parametrisation of the hyperplane. In $n$ dimensions you have: $$p=\mathbf v_0+\lambda_1 \mathbf {v_1}+\lambda_2\mathbf {v_2} +\cdots+\lambda_{n-1}\mathbf{v_{n-1}}$$

where the $\lambda_k$ are $n-1$ scalars from the field over which the space is defined.

Then all normal vectors $\vec n$ have the property that

$$(p-\mathbf {v_0})\cdot \vec n=0$$

Thus, $$\begin{align}0 &=(\lambda_1 \mathbf {v_1}+\lambda_2\mathbf {v_2} +\cdots+\lambda_{n-1}\mathbf{v_{n-1}})\cdot\vec n\\ &=\left(\begin{pmatrix}\mathbf {v_1} & \mathbf {v_2} &\cdots &\mathbf {v_{n-1}}\end{pmatrix}\vec \lambda \right)^T\vec n\\ &=(\vec\lambda)^T\begin{pmatrix}\mathbf {v_1}^T\\ \mathbf {v_2}^T\\ \vdots \\ \mathbf {v_{n-1}}^T\end{pmatrix}\vec n \end{align}$$

And since this must be true for every $\vec \lambda$,

$$\begin{pmatrix}\mathbf {v_1}^T\\ \mathbf {v_2}^T\\ \vdots \\ \mathbf {v_{n-1}}^T\end{pmatrix}\vec n=\mathbf 0$$

So $\vec n$ spans the kernel (nullspace)

$$\operatorname{ker} \begin{pmatrix}\mathbf {v_1}^T\\ \mathbf {v_2}^T\\ \vdots \\ \mathbf {v_{n-1}}^T\end{pmatrix}$$

or, equivalently, the cokernel (left nullspace) of the matrix whose columns are any basis for the direction of the affine subspace $p$:

$$\operatorname{coker}\big(\begin{matrix}\mathbf {v_1}& \mathbf {v_2}&\cdots&\mathbf {v_{n-1}}\end{matrix}\big)$$
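Numerically, this kernel can be computed directly with SciPy's `null_space` (a minimal sketch; the three direction vectors in $\mathbb R^4$ are arbitrary examples assumed linearly independent):

```python
import numpy as np
from scipy.linalg import null_space

# columns are v_1, v_2, v_3: direction vectors of a hyperplane in R^4
V = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])

# stack the v_k^T as rows and solve V^T n = 0
n = null_space(V.T)   # shape (4, 1): the kernel is one-dimensional

# n is orthogonal to every v_k
assert np.allclose(V.T @ n, 0.0)
```

Because the $n-1$ columns are linearly independent, the kernel is one-dimensional, and `null_space` returns a unit vector spanning it.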

ANOTHER ANSWER

Collect the $\mathbf v_k$ vectors into a matrix
$$V = \begin{pmatrix}\mathbf v_1 & \mathbf v_2 & \cdots & \mathbf v_{n-1}\end{pmatrix} \in \mathbb R^{n\times(n-1)}$$
and calculate its hat matrix
$$H = V\left(V^TV\right)^{-1}V^T$$
Notice that
$$\left(I-H\right)V = 0 \quad\implies\quad \left(I-H\right)\mathbf v_k = \mathbf 0$$
but for almost any other (i.e. random) vector $r\in\mathbb R^{n}$ one obtains
$$\mathbf v_n = \left(I-H\right)r$$
which is orthogonal to every column of $V$ (using the symmetry of $I-H$):
$$\mathbf v_n^T\mathbf v_k = r^T\left(I-H\right)\mathbf v_k = 0$$

$\sf NB\!:$ The hat matrix can be expressed using the pseudoinverse as $H = VV^+$, and thus $\left(I-H\right)$ is an orthogonal projector onto the left nullspace of $V$, i.e. the nullspace of $V^T$.
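The projector construction above can be sketched numerically as follows (the matrix $V$ and the random seed are arbitrary choices; with probability 1 a random $V$ has full column rank):

```python
import numpy as np

rng = np.random.default_rng(0)

# columns span the direction of a hyperplane in R^4 (arbitrary example)
V = rng.standard_normal((4, 3))

# hat matrix: orthogonal projector onto the column space of V
H = V @ np.linalg.inv(V.T @ V) @ V.T
# equivalently, via the pseudoinverse: H = V @ np.linalg.pinv(V)

# project a random vector onto the orthogonal complement
r = rng.standard_normal(4)
v_n = (np.eye(4) - H) @ r

# v_n is orthogonal to every column of V
assert np.allclose(V.T @ v_n, 0.0)
```

If the random $r$ happened to lie in the column space of $V$, the result would be zero; in practice this happens with probability 0, but a robust implementation would retry with a new $r$ in that case.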