The context is this: I am trying to write a program that computes the gradient of a discretized surface, i.e. a mesh. The catch, however, is that I need to do this in arbitrary dimensions.
I was given this formula:
That numerator is all fine and dandy in 3D, since to get a vector orthogonal to that edge you can just define the $\perp$ operator as $N\times (x_k - x_j)$, where $N$ is the normal to the triangle.
This, however, relies on the cross product, which does not exist in most dimensions. If instead of a mesh I have a hyper mesh, how can I get a vector that is both orthogonal to the hyper edge and contained in the hyper face?
To clarify, the diagram I have posted is one of the key steps needed to compute a discrete gradient on a 3D mesh (i.e. a 2D manifold). In that setting the formula I posted is well defined through the operator I described earlier. It will give you a vector that is both contained in the triangle and orthogonal to the edge.
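In the 3D setting this operator can be sketched as follows (the triangle coordinates are made up for illustration):

```python
import numpy as np

# A triangle (x_i, x_j, x_k) in R^3; the vertex names follow the
# formula in the question, the coordinates are arbitrary.
x_i = np.array([0.0, 0.0, 0.0])
x_j = np.array([1.0, 0.0, 0.0])
x_k = np.array([0.0, 1.0, 0.0])

edge = x_k - x_j                    # the edge we want a perpendicular to
N = np.cross(x_j - x_i, x_k - x_i)  # (unnormalized) triangle normal
perp = np.cross(N, edge)            # the perp operator: N x (x_k - x_j)

# Sanity checks: perp is orthogonal to the edge and to the normal,
# hence contained in the plane of the triangle.
assert abs(np.dot(perp, edge)) < 1e-12
assert abs(np.dot(perp, N)) < 1e-12
```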
Consider now the same pattern one dimension up, i.e. a 3D manifold embedded in 4D space. Each hyper face is a tetrahedron in 4D. The extension of the above formula would be to find a vector that is contained in the 3D subspace spanned by the edge vectors of the tetrahedron, and orthogonal to a vector that is itself orthogonal to all the vectors in that subspace (the hyper normal $N$).
One of the answers to this question describes a beautiful way to calculate $N$; however, that's only half the problem: we also need a vector that is orthogonal both to $N$ and to one of the faces of the tetrahedron (in the example).

This answer contains a solution with the following properties.
Let $x_1, x_2, \dots, x_k$ denote $k$ vectors in $\mathbb{R}^n$ with $k \leq n$ and $X \in \mathbb{R}^{n \times k}$ denote the (tall) matrix whose $j$th column is $x_j$. Let $X = QR$ denote a QR factorization of $X$. Then $Q \in \mathbb{R}^{n \times n}$ is orthogonal, i.e. $Q^TQ=I_n$ is the $n$-by-$n$ identity matrix and $R \in \mathbb{R}^{n \times k}$ is upper triangular. It follows that $$ V = \text{span}\{x_1, x_2, \dotsc, x_k\} \subseteq \text{span}\{q_1, q_2, \dotsc, q_k\}$$ where $q_j$ is the $j$th column of $Q$. Moreover, if $k<n$, then $q_{k+1}, q_{k+2}, \dotsc q_n$ are all perpendicular to $V$ which is more than what is required here.
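A minimal sketch of this property using NumPy's `numpy.linalg.qr` (my choice of library; the answer does not prescribe one): with `mode='complete'` you get the full $n$-by-$n$ factor $Q$, and the trailing columns are perpendicular to the span of the inputs.

```python
import numpy as np

# Three vectors in R^5 (k = 3, n = 5), stacked as columns of a tall matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))

# mode='complete' returns the full n-by-n orthogonal factor Q.
Q, R = np.linalg.qr(X, mode='complete')
k = X.shape[1]

# q_1..q_k span the same subspace as x_1..x_k (for full-rank X),
# while q_{k+1}..q_n are perpendicular to every x_j.
complement = Q[:, k:]               # shape (5, 2)
assert np.allclose(complement.T @ X, 0.0)
assert np.allclose(Q.T @ Q, np.eye(5))
```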
The QR factorization can be computed using the modified Gram-Schmidt method with or without column pivoting, using Givens rotations or using Householder reflectors. LAPACK implements Householder reflectors. The use of orthogonal transformations avoids many numerical issues which appear in real applications. The cost of doing a QR factorization of a general dense $n$ by $n$ matrix is $O(n^3)$ regardless of the method.
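Applied to the question in 4D, one arrangement (my own sketch, not stated verbatim in the answer) is to order the columns so that the vectors spanning the hyper edge come first, followed by the remaining edge vector of the hyper face. The Gram-Schmidt structure of QR then makes the last computed column lie in the face's subspace while being orthogonal to the hyper edge:

```python
import numpy as np

# A tetrahedron in R^4 with vertices v0..v3 (made-up coordinates).
rng = np.random.default_rng(1)
v = rng.standard_normal((4, 4))     # rows are vertices v0, v1, v2, v3

# The hyper face is spanned by the edge vectors from v0; suppose the
# hyper edge we care about is the triangle (v0, v1, v2).
e1 = v[1] - v[0]
e2 = v[2] - v[0]
e3 = v[3] - v[0]

# Put the hyper edge's spanning vectors first, then complete the face.
X = np.column_stack([e1, e2, e3])
Q, R = np.linalg.qr(X)              # reduced QR: Q is 4-by-3

# q_3 lies in span{e1, e2, e3} (the face's 3D subspace) and is
# orthogonal to span{e1, e2} (the hyper edge).
w = Q[:, 2]
assert abs(w @ e1) < 1e-10 and abs(w @ e2) < 1e-10

# Confirm w is in the column space of X: its least-squares fit is exact.
coeffs, *_ = np.linalg.lstsq(X, w, rcond=None)
assert np.allclose(X @ coeffs, w)
```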
Avoid using determinants in practical computations if at all possible. The question of computing even a 2-by-2 determinant in a reliable manner is quite complex.
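To make the point concrete with a made-up example: the naive formula $ad - bc$ in double precision can lose every significant digit to cancellation, as exact rational arithmetic confirms:

```python
from fractions import Fraction

# det([[a, b], [c, d]]) = a*d - b*c. The true determinant here is -1,
# but each product rounds to 1e16 in double precision, so the naive
# formula cancels to exactly 0.
a, b, c, d = 1e8 + 1, 1e8, 1e8, 1e8 - 1

naive = a * d - b * c
exact = Fraction(a) * Fraction(d) - Fraction(b) * Fraction(c)

print(naive)   # 0.0  (all significant digits lost)
print(exact)   # -1
```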