Using Gram-Schmidt to compute the cross product of $3$ vectors in $\Bbb R^4$

Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) on 2026-03-28 · 513 views · 2 answers

I want to ask about vector multiplication (the cross product) in $4$ dimensions. I have heard that the Gram-Schmidt process is involved, but I am not sure how. The multiplication involves $3$ vectors with $4$ components each. Could someone please help?
You do it the same way as in 3D, except with a $4 \times 4$ matrix whose rows are the three vectors together with a symbolic row $(w, x, y, z)$. Expand the determinant along that row: the coefficients of $w$, $x$, $y$, $z$ are the components of the cross product.
The usual rules apply: exchanging any pair of the three vectors negates the result, but a cyclic permutation of $A$, $B$, $C$ does not, since it is an even permutation.
The dot product of $V(A,B,C)$ with a fourth vector $D$ gives the (signed) volume of the parallelepiped spanned by those four vectors, which is zero if any three of them fall in the same hedrix (2d plane), or if all four are co-choric (lie in the same 3d space).
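The determinant recipe above can be sketched numerically. The function name `cross4` and the cofactor sign convention (matching expansion along the symbolic fourth row) are my own choices for illustration:

```python
import numpy as np

def cross4(a, b, c):
    """Cross product of three vectors in R^4 (illustrative sketch).

    Component i is the cofactor of the i-th entry of a symbolic fourth
    row (w, x, y, z) in the 4x4 determinant whose first three rows are
    a, b, c: the sign (-1)**(i + 1) is the row-4, column-(i+1) cofactor
    sign, and the minor deletes column i from the 3x4 matrix [a; b; c].
    """
    m = np.array([a, b, c], dtype=float)
    return np.array([(-1) ** (i + 1) * np.linalg.det(np.delete(m, i, axis=1))
                     for i in range(4)])

a = np.array([1.0, 0.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0, 0.0])
c = np.array([0.0, 0.0, 1.0, 0.0])
v = cross4(a, b, c)
print(v)  # ≈ (0, 0, 0, 1): orthogonal to a, b, and c
```

Replacing the symbolic row by any of $A$, $B$, $C$ yields a determinant with a repeated row, which is why the result is automatically orthogonal to all three inputs.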
Eckmann's definition of a cross product on a finite-dimensional, positive definite, real inner product space $\def\bfv{{\bf v}}(\Bbb V, \langle\,\cdot\, , \,\cdot\,\rangle)$ (say, $\dim \Bbb V = n$) is a map $$\times: \Bbb V^r \to \Bbb V$$ for some $r \in \{1, \ldots, n\}$ satisfying \begin{align} \langle \times(\bfv_1, \ldots, \bfv_r), \bfv_a \rangle &= 0, \qquad \qquad \qquad a \in \{1, \ldots, r\} \\ \langle \times(\bfv_1, \ldots, \bfv_r), \times(\bfv_1, \ldots, \bfv_r) \rangle &= \det [\langle \bfv_a, \bfv_b \rangle] . \end{align} Here, $[\langle \bfv_a, \bfv_b \rangle]$ denotes the $r \times r$ Gram matrix with $(a, b)$ entry $\langle \bfv_a, \bfv_b \rangle$.
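As a sanity check of the definition, the two conditions can be verified numerically for the familiar $3$-dimensional cross product ($n = 3$, $r = 2$); this is only an illustrative sketch:

```python
import numpy as np

# Check Eckmann's two conditions for the familiar 3D cross product
# (n = 3, r = 2), using arbitrary sample vectors.
v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 3.0])
c = np.cross(v1, v2)

# Condition 1: orthogonality to each argument.
assert abs(c @ v1) < 1e-12 and abs(c @ v2) < 1e-12

# Condition 2: |c|^2 equals the Gram determinant det[<v_a, v_b>].
gram = np.array([[v1 @ v1, v1 @ v2],
                 [v2 @ v1, v2 @ v2]])
assert abs(c @ c - np.linalg.det(gram)) < 1e-9
print("Eckmann's conditions hold for this example")
```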
The first condition asks precisely that the cross product of the vectors $\bfv_1, \ldots, \bfv_r$ be orthogonal to each of those vectors. If the $r$ vectors are linearly dependent, the Gram determinant vanishes and the cross product is zero by the second condition, so we henceforth assume they are linearly independent; the first condition then imposes $r$ independent linear constraints. In a case like ours, where $r = n - 1$, we have $n - 1$ independent linear constraints, which determine a line.
Now, since this line is determined by orthogonality constraints, we can find a unit vector $\bf u$ spanning it: pick any vector $\bfv_n$ linearly independent from the given vectors $\bfv_1, \ldots, \bfv_{n - 1}$ whose cross product we are computing, apply the Gram-Schmidt algorithm to the basis $(\bfv_1, \ldots, \bfv_n)$, and extract the last unit vector. By construction, the cross product is given by $$\times(\bfv_1, \ldots, \bfv_{n - 1}) = \lambda {\bf u}$$ for some $\lambda \in \Bbb R$. Substituting into the second condition gives $$\langle \lambda {\bf u}, \lambda {\bf u} \rangle = \det [\langle \bfv_a, \bfv_b \rangle],$$ and since $\bf u$ has unit length, we have $$\lambda^2 = \det [\langle \bfv_a, \bfv_b \rangle],$$ which determines $\lambda$ up to sign. (This reflects that there are actually two $(n - 1)$-fold cross products on $\Bbb V$, each determined by a choice of orientation.)

On $\Bbb R^n$, there is a canonical orientation that we usually call "right-handed", so we can fix the sign of $\lambda$ by requiring that the basis $$(\bfv_1, \ldots, \bfv_{n - 1}, \times(\bfv_1, \ldots, \bfv_{n - 1}))$$ be right-handed. Given a right-handed basis (like the canonical one on $\Bbb R^n$), we can regard the vectors ${\bf w}_a$ in the basis as vectors $[{\bf w}_a] \in \Bbb R^n$, in which case the right-handed condition is just that $$\det \pmatrix{\bfv_1 & \cdots & \bfv_{n - 1} & \times(\bfv_1, \ldots, \bfv_{n - 1})} > 0.$$
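The recipe above can be sketched numerically. This is a hypothetical helper of my own devising: I use `numpy.linalg.qr` as a stand-in for Gram-Schmidt orthonormalization, and a (seeded) random vector to complete the basis, which is linearly independent of the inputs with probability one:

```python
import numpy as np

def cross_gs(vs):
    """(n-1)-fold cross product in R^n via the Gram-Schmidt recipe
    (illustrative sketch): orthonormalize (v_1, ..., v_{n-1}, v_n) for
    some v_n completing a basis, take the last unit vector u, scale it
    by sqrt of the Gram determinant, and fix the sign by right-handedness.
    """
    vs = np.array(vs, dtype=float)
    r, n = vs.shape                      # here r = n - 1
    # Complete to a basis with a random vector, then orthonormalize;
    # the columns of q from QR are the Gram-Schmidt output.
    extra = np.random.default_rng(0).standard_normal(n)
    q, _ = np.linalg.qr(np.vstack([vs, extra]).T)
    u = q[:, -1]                         # unit vector orthogonal to all v_a
    lam = np.sqrt(np.linalg.det(vs @ vs.T))  # |cross| from condition 2
    w = lam * u
    # Choose the sign so that (v_1, ..., v_{n-1}, w) is right-handed.
    if np.linalg.det(np.vstack([vs, w])) < 0:
        w = -w
    return w

w = cross_gs([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]])
print(np.round(w, 10))  # ≈ (0, 0, 0, 1), as expected for e1, e2, e3
```

Note that if the inputs are linearly dependent, the Gram determinant vanishes and the function correctly returns the zero vector.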