When I calculate a cross product of two vectors in Cartesian coordinates, I calculate something that seems like the determinant of a 2x2 matrix.
Is there any connection between the determinant and the cross product?
Maybe this isn't the answer you're looking for, but one expression for the determinant of a 3x3 matrix with columns $\vec v_1,\vec v_2,\vec v_3$ is $$ \vec v_1\cdot(\vec v_2\times\vec v_3) $$ You can make sense of this algebraically or geometrically (recall that the determinant is the signed volume of the parallelepiped whose sides are given by the three vectors).
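To make this concrete, here is a minimal pure-Python check (the helper names `cross`, `dot`, and `det3_cols` are my own, not from the answer) that the scalar triple product equals the determinant of the matrix whose columns are the three vectors:

```python
def cross(u, v):
    # Component formula for the cross product in R^3.
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def det3_cols(v1, v2, v3):
    # Determinant of the 3x3 matrix with columns v1, v2, v3,
    # expanded in cofactors along the first row.
    a, b, c = v1[0], v2[0], v3[0]
    d, e, f = v1[1], v2[1], v3[1]
    g, h, i = v1[2], v2[2], v3[2]
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

v1, v2, v3 = (1, 2, 3), (4, 5, 6), (7, 8, 10)
assert dot(v1, cross(v2, v3)) == det3_cols(v1, v2, v3) == -3
```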
This is in almost every textbook. The usual definition of the cross product is the formal determinant $$\vec v\times\vec w = \left|\begin{matrix} \vec i & \vec j & \vec k \\ v_1 & v_2 & v_3 \\ w_1 & w_2 & w_3 \end{matrix}\right|.$$ As you stated in your question, you expand in cofactors along the first row.
This is NOT a stupid question. I am glad that you expressed your curiosity.
When I was taught the cross product of vectors in $\mathbb{R}^3$, I was given the formula \begin{equation} \begin{split} \left(\begin{array}{c}u_1\\u_2\\u_3\end{array}\right) \times \left(\begin{array}{c}v_1\\v_2\\v_3\end{array}\right) &=~ \left|\begin{array}{ccc} \mathbf{i} & \mathbf{j} & \mathbf{k}\\ u_1 & u_2 & u_3\\ v_1 & v_2 & v_3\\ \end{array}\right|\\ & \\ &=~ \underbrace{(u_2v_3 - u_3v_2)}_{\left|\begin{array}{cc}u_2 & u_3\\v_2 & v_3\end{array}\right|}\mathbf{i} + \underbrace{(u_3v_1 - u_1v_3)}_{-\left|\begin{array}{cc}u_1 & u_3\\v_1 & v_3\end{array}\right|}\mathbf{j} + \underbrace{(u_1v_2 - u_2v_1)}_{\left|\begin{array}{cc}u_1 & u_2\\v_1 & v_2\end{array}\right|}\mathbf{k}, \end{split} \end{equation} where $\mathbf{i}$, $\mathbf{j}$, $\mathbf{k}$ denote the standard unit vectors along the coordinate axes.
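As a small pure-Python sketch of that cofactor expansion (the function names `minor2` and `cross_cofactor` are my own invention), each component of the cross product is exactly a signed $2\times 2$ minor:

```python
def minor2(a, b, c, d):
    # 2x2 determinant | a  b |
    #                 | c  d |
    return a*d - b*c

def cross_cofactor(u, v):
    # Expand the symbolic 3x3 determinant along its first row:
    # +|u2 u3; v2 v3| i  -  |u1 u3; v1 v3| j  +  |u1 u2; v1 v2| k
    u1, u2, u3 = u
    v1, v2, v3 = v
    return ( minor2(u2, u3, v2, v3),
            -minor2(u1, u3, v1, v3),
             minor2(u1, u2, v1, v2))

assert cross_cofactor((1, 0, 0), (0, 1, 0)) == (0, 0, 1)   # i x j = k
```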
One definition of the cross product is the vector $a \times b$ such that $\langle x , a \times b \rangle = \det \begin{bmatrix} x & a & b\end{bmatrix}= \det \begin{bmatrix} x^T \\ a^T \\ b^T\end{bmatrix}$.
This is, of course, equivalent to all of the above.
To determine the $x,y,z$ components of $a \times b$, one computes $\langle e_k , a \times b \rangle$ for $k=1,2,3$, which gives, of course, exactly the same answer as the symbolic version with $x = ( \vec i, \vec j , \vec k )^T$.
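Here is a short pure-Python sketch of that recipe (the names `det3_rows` and `cross_via_dets` are hypothetical): each component of $a \times b$ is a $3\times 3$ determinant with a standard basis vector as the first row.

```python
def det3_rows(r1, r2, r3):
    # Determinant of the 3x3 matrix with rows r1, r2, r3.
    a, b, c = r1
    d, e, f = r2
    g, h, i = r3
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

def cross_via_dets(a, b):
    # The k-th component of a x b is <e_k, a x b> = det [e_k^T; a^T; b^T].
    basis = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
    return tuple(det3_rows(e, a, b) for e in basis)

assert cross_via_dets((1, 2, 3), (4, 5, 6)) == (-3, 6, -3)
```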
You can calculate the determinant of an $n\times n$ matrix using the Levi-Civita tensor. The Levi-Civita tensor $\varepsilon_{ijk}$ (in 3 dimensions) is defined as follows.
\begin{align} \varepsilon_{123}&=1\\ \varepsilon_{ijk}&=0 &\text{if}\ i=j\text{ or } j=k\text{ or } i=k\\ \varepsilon_{ijk}&=1 & \text{ if }{ijk}\text{ is an even permutation}\\ \varepsilon_{ijk}&=-1 & \text{ if }{ijk}\text{ is an odd permutation} \end{align} Here an even permutation means one obtained from $123$ by an even number of swaps, and similarly for odd. For example, $312$ is even because you can reach $123$ with two swaps: $312\rightarrow132\rightarrow123$. You then get the following formula for the determinant (in 3D) $$\det A=\sum_{i,j,k}\varepsilon_{ijk}a_{1,i} a_{2,j}a_{3,k}$$ In $n$ dimensions you would need $n$ indices. You can calculate this sum for yourself to see that it works. You can also write the cross product using the Levi-Civita tensor $$(\vec u\times\vec v)_i=\sum_{j,k}\varepsilon_{ijk}u_jv_k$$ So this is where the similarity comes from.
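Both formulas are easy to check numerically. Below is a pure-Python sketch (the function names are mine) that implements $\varepsilon_{ijk}$ as a lookup table and evaluates both sums:

```python
def levi_civita(i, j, k):
    # epsilon_{ijk}: +1 for even permutations of (1, 2, 3),
    # -1 for odd ones, 0 whenever an index repeats.
    return {(1, 2, 3): 1, (2, 3, 1): 1, (3, 1, 2): 1,
            (1, 3, 2): -1, (3, 2, 1): -1, (2, 1, 3): -1}.get((i, j, k), 0)

R = (1, 2, 3)  # index range

def det_lc(A):
    # det A = sum_{i,j,k} eps_{ijk} a_{1,i} a_{2,j} a_{3,k}
    # (A is a 3x3 list of row lists; indices shifted to 0-based).
    return sum(levi_civita(i, j, k) * A[0][i-1] * A[1][j-1] * A[2][k-1]
               for i in R for j in R for k in R)

def cross_lc(u, v):
    # (u x v)_i = sum_{j,k} eps_{ijk} u_j v_k
    return tuple(sum(levi_civita(i, j, k) * u[j-1] * v[k-1]
                     for j in R for k in R)
                 for i in R)

assert det_lc([[1, 4, 7], [2, 5, 8], [3, 6, 10]]) == -3
assert cross_lc((1, 2, 3), (4, 5, 6)) == (-3, 6, -3)
```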
There certainly is a connection! Other answers have shown that, of course, but it goes a little deeper than that: determinants and cross products are both based on antisymmetric linear combinations of permutations.
Suppose you have two things, $a$ and $b$. There are two ways to order them, i.e. two permutations: $$\begin{gather}ab & ba\end{gather}$$ Now, if these things can be multiplied and added/subtracted, you can combine these permutations in two distinctly different ways: $$\begin{gather}ab + ba & ab - ba\end{gather}$$ The first one is called symmetric because, if you exchange the two things, its value stays the same. $$ab + ba \underset{a\leftrightarrow b} \longrightarrow ba + ab = ab + ba$$ The second one is called antisymmetric because, if you exchange the two things, it becomes the negative of itself (hence "anti"). $$ab - ba \underset{a\leftrightarrow b} \longrightarrow ba - ab = -(ab - ba)$$
If you add another thing $c$ to the set, there are now six permutations: $$\begin{gather}abc & acb & bca & bac & cab & cba\end{gather}$$ Again, there's a symmetric way to combine these, where switching any two of the elements $a$, $b$, and $c$ leaves the value unchanged: $$abc + acb + bac + bca + cab + cba$$ and there's a (totally¹) antisymmetric way to combine them, where switching any two of $a$, $b$, and $c$ turns it into the negative of the original value: $$abc - acb + bca - bac + cab - cba$$ (If you have a bit of time, I'd encourage you to check all three possible swaps and verify this.)
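The swap check can even be mechanized. In this pure-Python sketch (entirely my own construction, not from the answer), each product like $abc$ is a string with an integer coefficient, and exchanging two letters is a relabeling; all three swaps negate the totally antisymmetric combination:

```python
from itertools import permutations

# The totally antisymmetric combination abc - acb + bca - bac + cab - cba,
# stored as {monomial: coefficient}.  Signs are listed in the order
# itertools.permutations yields: abc, acb, bac, bca, cab, cba.
antisym = {"".join(p): sign
           for p, sign in zip(permutations("abc"), (1, -1, -1, 1, 1, -1))}

def swap_letters(combo, x, y):
    # Exchange the symbols x and y in every monomial of the combination.
    table = str.maketrans({x: y, y: x})
    return {term.translate(table): coeff for term, coeff in combo.items()}

# Every one of the three possible swaps negates the whole expression.
for x, y in (("a", "b"), ("a", "c"), ("b", "c")):
    assert swap_letters(antisym, x, y) == {t: -c for t, c in antisym.items()}
```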
There are, of course, other ways to add and subtract the six permutations, but none of them are totally symmetric or totally antisymmetric. (If you have a bit more time, feel free to check all the combinations.)
And while I won't get into the details here, the antisymmetric case is particularly interesting because even if you go beyond permutations to allow repeats like $aaa$, there's still only one way to form a totally antisymmetric combination. This fact will be useful shortly.
Now what does this have to do with cross products? Well, consider this: the "ingredients" that go into a cross product are three components of the first vector $(a_1, a_2, a_3)$, three components of the second vector $(b_1, b_2, b_3)$, and three unit vectors $\hat{x}_1$, $\hat{x}_2$, and $\hat{x}_3$. If you want to make a product out of these things and have it not be "weird", hopefully it makes sense that it should probably involve multiplying a component of $a$, a component of $b$, and a unit vector.
So suppose you write out a generic formula for a product of these three things: $$a_i b_j \hat{x}_k,\quad i,j,k\in\{1,2,3\}$$ You have to choose an index ($1$, $2$, or $3$) for each factor: the component of $a$, the component of $b$, and the unit vector. Of course there are many different ways to make this choice, but there's one combination that will be totally antisymmetric: $$a_1 b_2 \hat{x}_3 - a_1 b_3 \hat{x}_2 + a_2 b_3 \hat{x}_1 - a_2 b_1 \hat{x}_3 + a_3 b_1 \hat{x}_2 - a_3 b_2 \hat{x}_1$$ That's a cross product. It's the unique totally antisymmetric linear combination of all possible terms that can be formed by multiplying one component of $a$, one component of $b$, and one unit vector without repeating indices.
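To see that this antisymmetric combination really is the cross product, here is a short pure-Python sketch (function names are hypothetical) that sums $\operatorname{sign}(i,j,k)\, a_i b_j \hat{x}_k$ over all six permutations:

```python
from itertools import permutations

def perm_sign(p):
    # Sign of a permutation: +1 for an even number of inversions, -1 for odd.
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def cross_antisym(a, b):
    # Sum sign(i,j,k) * a_i * b_j * e_k over all permutations (i,j,k)
    # of (0,1,2): the unique totally antisymmetric combination above.
    result = [0, 0, 0]
    for i, j, k in permutations(range(3)):
        result[k] += perm_sign((i, j, k)) * a[i] * b[j]
    return tuple(result)

assert cross_antisym((1, 2, 3), (4, 5, 6)) == (-3, 6, -3)
```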
If you think about it, it makes sense why you would want the cross product to be either totally symmetric or totally antisymmetric: if it weren't, then its value would change if you relabeled one dimension as another. You might have two vectors whose cross product is $(5, 3, 2)$ under regular coordinates, but if you changed your coordinate system to switch the first and second dimensions, without (anti)symmetry the cross product could have an entirely different value, like $(-1, 4, 1)$. A mathematical operation that depends on something totally unphysical like how you label your dimensions probably isn't very useful.
Given that way of looking at a cross product, the determinant of a $3\times 3$ matrix is almost trivially the same thing. Suppose you have this matrix: $$\begin{matrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{matrix}$$ If you choose sets of three elements such that each set contains one element from each row and one element from each column, you get exactly six possible sets: $$(\{a_{11}, a_{22}, a_{33}\}, \{a_{11}, a_{23}, a_{32}\}, \{a_{12}, a_{23}, a_{31}\}, \{a_{12}, a_{21}, a_{33}\}, \{a_{13}, a_{21}, a_{32}\}, \{a_{13}, a_{22}, a_{31}\})$$ These sets, unsurprisingly, correspond to the six permutations of $\{1,2,3\}$. If you always choose the first index to be in numerical order, then the ways of choosing which second index corresponds to each first index are precisely the permutations. So you can multiply each set and form an antisymmetric linear combination of those products: $$a_{11}a_{22}a_{33} - a_{11}a_{23}a_{32} + a_{12}a_{23}a_{31} - a_{12}a_{21}a_{33} + a_{13}a_{21}a_{32} - a_{13}a_{22}a_{31}$$ That's a determinant.
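That signed sum over permutations is the Leibniz formula for the determinant; here is a compact pure-Python sketch of it (the names `perm_sign` and `det_leibniz` are mine):

```python
from itertools import permutations
from math import prod

def perm_sign(p):
    # +1 for an even number of inversions, -1 for an odd number.
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det_leibniz(A):
    # Sum, over all ways of picking one entry per row with all columns
    # distinct, of the product times the sign of the column permutation.
    n = len(A)
    return sum(perm_sign(p) * prod(A[r][p[r]] for r in range(n))
               for p in permutations(range(n)))

assert det_leibniz([[1, 4, 7], [2, 5, 8], [3, 6, 10]]) == -3
```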
It makes sense for the determinant to be either totally symmetric or totally antisymmetric for much the same reason as the cross product: a matrix of this form can represent some kind of transformation on 3D vectors, in which case the three indices correspond to the three dimensions of space, and a quantity which changes in a major way when you relabel which dimension is which probably won't be very useful.
¹ Totally antisymmetric is the term to use when exchanging any two elements negates the expression. You can also have an expression which is partially antisymmetric, meaning that exchanging some pairs of elements reverses the sign, but not others. For example, in $$abc - acb + bca - bac - cab + cba$$ if you switch $a\leftrightarrow b$, it negates the expression, but switching $a\leftrightarrow c$ or $b\leftrightarrow c$ does not.
If $\vec{i},\vec{j},\vec{k}$ are the three standard basis vectors of $\mathbb{R}^3$, then the cross product of the vectors $(a,b,c)$ and $(p,q,r)$ is, by definition, the determinant of the matrix $$\left(\begin{array}{lll}\vec{i}&\vec{j}&\vec{k}\\ a &b & c\\ p&q &r\end{array}\right)$$ The coordinates of that vector are obtained by expanding this determinant along the first row.