How do you get an orthogonal vector in arbitrary dimensions?


The context is, I am trying to create a program to calculate the gradient of a discretized surface, i.e. a mesh. The trick however is, I need to do this on arbitrary dimensions.

I was given this formula:

[formula image not reproduced here; per the discussion below, its numerator contains a term of the form $(x_k - x_j)^\perp$]

That numerator is all fine and dandy in 3D, since to get a vector orthogonal to that edge you can just define the $\perp$ operator as $N\times (x_k - x_j)$, where $N$ is the normal to the triangle.

This however relies on the cross product, which exists as a binary product only in three (and seven) dimensions. If instead of a mesh I have a hyper mesh, how can I get a vector that is both orthogonal to the hyper edge and contained in the hyper face?

To clarify, the diagram I have posted is one of the key steps needed to compute a discrete gradient on a 3D mesh (i.e. a 2D manifold embedded in 3D). In that setting the formula I posted is well defined through the operator I described earlier: it gives you a vector that is both contained in the triangle and orthogonal to the edge.

Consider now the same pattern one dimension up, i.e. a 3D manifold embedded in 4D space, where each hyper face is a tetrahedron. The extension of the above formula would be to find a vector that lies in the 3D subspace spanned by the edge vectors of the tetrahedron and is orthogonal to the vector that is orthogonal to that entire subspace (the hyper normal $N$).

One of the answers to this question describes a beautiful way to calculate $N$; however, that's only half the problem: we still need a vector that is orthogonal to $N$ and to one of the faces of the tetrahedron (in the example).
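To make the requirement concrete, here is a minimal sketch of what the generalized $\perp$ operator should compute, using nothing but Gram-Schmidt projection (the function name and setup are mine, not from any library): orthonormalize the hyper edge's spanning vectors, then subtract from a remaining face vector its component along the edge; what is left lies in the face and is orthogonal to the edge.

```python
import numpy as np

def perp_in_face(edge_vectors, extra_vector):
    """Return the component of extra_vector orthogonal to span(edge_vectors).

    If extra_vector lies in the hyper face but not in the hyper edge, the
    result lies in the face and is orthogonal to the whole edge.  Plain
    Gram-Schmidt; fine for a sketch, see the answers below for more robust
    approaches.
    """
    v = np.asarray(extra_vector, dtype=float)
    basis = []
    for e in edge_vectors:
        e = np.asarray(e, dtype=float)
        for b in basis:                    # orthonormalize the edge span
            e = e - (b @ e) * b
        nrm = np.linalg.norm(e)
        if nrm > 1e-12:                    # skip (near-)dependent vectors
            basis.append(e / nrm)
    for b in basis:                        # strip the edge component off v
        v = v - (b @ v) * b
    return v

# Example: in R^4, edge spanned by e1 and e2, remaining face vector (1,1,1,0);
# the result is (0,0,1,0), orthogonal to the edge and inside the face.
v = perp_in_face([[1, 0, 0, 0], [0, 1, 0, 0]], [1, 1, 1, 0])
```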

Best answer:

This answer contains a solution which has the following properties

  1. It works even if the given vectors are linearly dependent.
  2. It works in floating point arithmetic.
  3. It requires only $O(n^3)$ arithmetic operations.

Let $x_1, x_2, \dots, x_k$ denote $k$ vectors in $\mathbb{R}^n$ with $k \leq n$, and let $X \in \mathbb{R}^{n \times k}$ denote the (tall) matrix whose $j$th column is $x_j$. Let $X = QR$ denote a full QR factorization of $X$. Then $Q \in \mathbb{R}^{n \times n}$ is orthogonal, i.e. $Q^TQ=I_n$ is the $n$-by-$n$ identity matrix, and $R \in \mathbb{R}^{n \times k}$ is upper triangular. It follows that $$ V = \text{span}\{x_1, x_2, \dotsc, x_k\} \subseteq \text{span}\{q_1, q_2, \dotsc, q_k\},$$ where $q_j$ is the $j$th column of $Q$. Moreover, if $k<n$, then $q_{k+1}, q_{k+2}, \dotsc, q_n$ are all perpendicular to $V$, which is more than what is required here.

The QR factorization can be computed using the modified Gram-Schmidt method with or without column pivoting, using Givens rotations or using Householder reflectors. LAPACK implements Householder reflectors. The use of orthogonal transformations avoids many numerical issues which appear in real applications. The cost of doing a QR factorization of a general dense $n$ by $n$ matrix is $O(n^3)$ regardless of the method.
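As a concrete sketch of the above (using NumPy's `numpy.linalg.qr`, which wraps the LAPACK Householder routines; the helper name is mine), the trailing $n-k$ columns of the full $Q$ are exactly the perpendicular directions:

```python
import numpy as np

def perp_basis(X):
    """Orthonormal vectors perpendicular to the column span of X.

    X is n-by-k with k < n.  The trailing n - k columns of the full QR
    factorization's Q are orthogonal to span(x_1, ..., x_k), even when
    the columns x_j are linearly dependent.
    """
    n, k = X.shape
    Q, R = np.linalg.qr(X, mode="complete")  # Q is n-by-n orthogonal
    return Q[:, k:]                          # last n-k columns are perp to col(X)

# Two vectors in R^4: the orthogonal complement is 2-dimensional.
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])
P = perp_basis(X)
# X.T @ P is (numerically) zero: every column of P is perp to every x_j.
```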


Avoid using determinants in practical computations if at all possible. The question of computing even a 2-by-2 determinant in a reliable manner is quite complex.
Another answer:

If you have $n-1$ linearly independent vectors in $\mathbb{R}^n$, you can find a vector orthogonal to them all in pretty much the same way.

Given $\vec{x}_1$ in $\mathbb{R}^2$: $\det\begin{bmatrix}\vec{e}_1&\vec{e}_2\\x_{11}&x_{12}\end{bmatrix}$

Given $\vec{x}_1,\vec{x}_2$ in $\mathbb{R}^3$: $\det\begin{bmatrix}\vec{e}_1&\vec{e}_2&\vec{e}_3\\x_{11}&x_{12}&x_{13}\\x_{21}&x_{22}&x_{23}\end{bmatrix}$

Given $\vec{x}_1,\vec{x}_2,\vec{x}_3$ in $\mathbb{R}^4$: $\det\begin{bmatrix}\vec{e}_1&\vec{e}_2&\vec{e}_3&\vec{e}_4\\x_{11}&x_{12}&x_{13}&x_{14}\\x_{21}&x_{22}&x_{23}&x_{24}\\x_{31}&x_{32}&x_{33}&x_{34}\end{bmatrix}$

And so on. While computing the determinant, just treat the $\vec{e}_i$ as abstract symbols. You end up with an expression you interpret as a vector, and it is guaranteed orthogonal to all of the $n-1$ vectors. It is nonzero if and only if the $n-1$ vectors are linearly independent. I leave the proof to you and/or references.
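A hedged sketch of this recipe in code (the helper name is mine; NumPy handles the minors' determinants): expanding the symbolic determinant along its first row makes the $i$-th component, with $0$-based indexing, equal to $(-1)^i$ times the determinant of the minor obtained by deleting column $i$.

```python
import numpy as np

def generalized_cross(vectors):
    """Vector orthogonal to the n-1 given vectors in R^n, via the symbolic
    determinant: component i is (-1)**i times the determinant of the minor
    that deletes column i (cofactor expansion along the row of e_i's)."""
    vectors = np.asarray(vectors, dtype=float)  # shape (n-1, n), rows x_1..x_{n-1}
    n = vectors.shape[1]
    assert vectors.shape[0] == n - 1, "need exactly n-1 vectors in R^n"
    result = np.empty(n)
    for i in range(n):
        minor = np.delete(vectors, i, axis=1)   # drop column i
        result[i] = (-1) ** i * np.linalg.det(minor)
    return result

# In R^3 this reduces to the ordinary cross product: e1 x e2 = e3.
v = generalized_cross([[1, 0, 0], [0, 1, 0]])   # v is (0, 0, 1)
```

Note that, as the first answer warns, determinant-based formulas like this one can be numerically delicate; the cofactor loop above also costs $O(n^4)$ via $n$ determinants, versus $O(n^3)$ for one QR factorization.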