Show that $V_2 = V_1 \otimes V_1$


I confused myself slightly while trying to represent a quaternion $x = a_1 \mathbf{1} + a_2 \mathbf{i} +a_3 \mathbf{j} + a_4 \mathbf{k}$. It can be represented as a $2 \times 2$ matrix with complex entries:

$$ \left[ \begin{array}{cr} a_1 + a_2 i & -a_3 + a_4 i \\ a_3 + a_4 i & a_1 - a_2 i \end{array} \right] \stackrel{\det}{\to} a_1^2 + a_2^2 + a_3^2 + a_4^2 $$

but then I said this is too complicated and tried to represent it instead as a $4 \times 4$ matrix over the reals:

$$ \left[ \begin{array}{rrrr} a_1 & -a_2 & -a_3 & -a_4 \\ a_2 & a_1 & a_4 & -a_3 \\ a_3 & -a_4 & a_1 & a_2 \\ a_4 & a_3 & -a_2 & a_1 \end{array} \right] \stackrel{\det}{\to} a_1^4 + a_2^4 + a_3^4 + a_4^4 + (\dots) $$

This determinant is a quartic. Let's check with a computer to make sure:

>>> from sympy import symbols, Matrix
>>> a, b, c, d = symbols('a b c d')
>>> x = Matrix([[a, -b, -c, -d], [b, a, d, -c], [c, -d, a, b], [d, c, -b, a]])
>>> x
Matrix([
[a, -b, -c, -d],
[b,  a,  d, -c],
[c, -d,  a,  b],
[d,  c, -b,  a]])
>>> x.det()
a**4 + 2*a**2*b**2 + 2*a**2*c**2 + 2*a**2*d**2 + b**4 + 2*b**2*c**2 + 2*b**2*d**2 + c**4 + 2*c**2*d**2 + d**4

The computer has done the combinatorics and the bookkeeping for us: the output factors as $(a^2+b^2+c^2+d^2)^2$. How did the determinant become a square? The first matrix is $V_1 \in M_{2\times 2}(\mathbb{C})$ and the second is $V_2 \in M_{4\times 4}(\mathbb{R})$. How can it be that $(\det V_1)^2 = \det V_2$?
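Indeed, sympy can confirm both the factorization and the relation $(\det V_1)^2 = \det V_2$ directly (a minimal sketch, rebuilding the two matrices from the question):

```python
from sympy import symbols, Matrix, I, expand

a, b, c, d = symbols('a b c d', real=True)
V1 = Matrix([[a + b*I, -c + d*I],
             [c + d*I,  a - b*I]])
V2 = Matrix([[a, -b, -c, -d],
             [b,  a,  d, -c],
             [c, -d,  a,  b],
             [d,  c, -b,  a]])
# det(V1) is the quaternion norm, and det(V2) is exactly its square
assert expand(V1.det()) == a**2 + b**2 + c**2 + d**2
assert expand(V2.det() - V1.det()**2) == 0
```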

It looks like we have $V_2 = V_1 \otimes V_1$. Could these be the regular representation or the fundamental representation? I always mix these terms up; they seem too generic.


No, $V_2$ is clearly not the Kronecker product of $V_1$ with itself, as we can easily check by hand - in particular, the $4\times4$ matrix $V_1\otimes V_1$ has entries which are quadratic in the unknowns, unlike $V_2$, and $V_1\otimes V_1$ would be a $4\times4$ complex matrix rather than a real matrix like $V_2$.
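One can verify this in sympy (a quick sketch; `TensorProduct` from `sympy.physics.quantum` computes the explicit Kronecker product of two matrices):

```python
from sympy import symbols, Matrix, I, expand
from sympy.physics.quantum import TensorProduct  # explicit Kronecker product

a, b, c, d = symbols('a b c d', real=True)
V1 = Matrix([[a + b*I, -c + d*I],
             [c + d*I,  a - b*I]])
K = TensorProduct(V1, V1)
# the (0, 0) entry is (a + b*I)^2: quadratic and genuinely complex,
# so K cannot equal the real, entry-linear matrix V2
assert K.shape == (4, 4)
assert expand(K[0, 0]) == expand((a + b*I)**2)
```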

Moreover, the Kronecker product does not satisfy the identity

$$ \det(A\otimes B)=\det(A)\times\det(B). $$

Rather, the block diagonal matrix $A\oplus B$ does:

$$ \det(A\oplus B)=\det(A)\det(B). $$

If $A$ is $m\times m$ and $B$ is $n\times n$ then the Kronecker product satisfies

$$ \det(A\otimes B)=\det(A)^n\det(B)^m. $$
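Both determinant identities are easy to sanity-check in sympy (a small sketch with generic $2\times 2$ matrices, so $m = n = 2$; `TensorProduct` computes the explicit Kronecker product):

```python
from sympy import Matrix, diag, symbols, expand
from sympy.physics.quantum import TensorProduct

a11, a12, a21, a22, b11, b12, b21, b22 = symbols('a11 a12 a21 a22 b11 b12 b21 b22')
A = Matrix([[a11, a12], [a21, a22]])   # m = 2
B = Matrix([[b11, b12], [b21, b22]])   # n = 2

# direct sum: det(A ⊕ B) = det(A) det(B)
D = diag(A, B)
assert expand(D.det() - A.det()*B.det()) == 0

# Kronecker product: det(A ⊗ B) = det(A)^n det(B)^m
K = TensorProduct(A, B)
assert expand(K.det() - A.det()**2 * B.det()**2) == 0
```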

(Recall if $A,B$ are linear maps on $V,W$ respectively then there's a well-defined linear map $A\otimes B$ defined on the tensor product $V\otimes W$ by the formula $(A\otimes B)(v\otimes w)=Av\otimes Bw$, and if one picks bases for $V,W$ it induces a basis for $V\otimes W$ and the matrix for the tensor product $A\otimes B$ is the Kronecker product of the matrices for $A$ and $B$.)

Quaternions can be represented by $2\times 2$ complex matrices because $\mathbb{H}$ is a two-dimensional (right) complex vector space and for each quaternion $q$ there is a corresponding "left-multiplication-by-$q$" map which is $\mathbb{C}$-linear.
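As a quick sympy sketch of this (the `quat_matrix` helper below is just a name for the $2\times 2$ matrix from the question, and the $c_i$ are Hamilton's multiplication rules written out by hand), the determinant of this matrix is the quaternion norm, and the norm is multiplicative:

```python
from sympy import symbols, Matrix, I, expand

def quat_matrix(a, b, c, d):
    # the 2x2 complex matrix attached to q = a + b*i + c*j + d*k in the question
    return Matrix([[a + b*I, -c + d*I],
                   [c + d*I,  a - b*I]])

a1, a2, a3, a4 = symbols('a1:5', real=True)
b1, b2, b3, b4 = symbols('b1:5', real=True)

# components of the quaternion product p*q, from Hamilton's rules
c1 = a1*b1 - a2*b2 - a3*b3 - a4*b4
c2 = a1*b2 + a2*b1 + a3*b4 - a4*b3
c3 = a1*b3 - a2*b4 + a3*b1 + a4*b2
c4 = a1*b4 + a2*b3 - a3*b2 + a4*b1

# det = norm, and the norm is multiplicative: N(p*q) = N(p) N(q)
lhs = expand(quat_matrix(c1, c2, c3, c4).det())
rhs = expand((a1**2 + a2**2 + a3**2 + a4**2)*(b1**2 + b2**2 + b3**2 + b4**2))
assert lhs == rhs
```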

In general, we can consider a vector space $V$ over a field $L$, and a linear map $A$ on $V$. If $L/K$ is a field extension, say of degree $d$, we can also consider $V$ as a vector space over $K$, and so we can take the determinant of $A$ with respect to $L$ or $K$. Call these $\det_L(A)$ and $\det_K(A)$. Furthermore, $L$ is itself a vector space over $K$, and so every element $a$ of $L$ has a "multiplication-by-$a$" map which is a $K$-linear transformation of $L$, whose determinant is called the norm $N_{L/K}(a)$.

Theorem. $~\det_K(A)=N_{L/K}(\det_L(A))$.

In the event that $a=\det_L(A)$ is already a scalar in $K$, we have $N_{L/K}(a)=a^d$. It just so happens that multiplication by a quaternion affords a $2\times2$ complex matrix with real determinant (this follows from the fact that quaternions have polar forms and from the "accidental isomorphism" $\mathrm{Sp}(1)\cong\mathrm{SU}(2)$), so you get a determinant squared in your case, since $[\mathbb{C}:\mathbb{R}]=2$.
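Here is a minimal sympy sketch of the theorem in the case $L/K = \mathbb{C}/\mathbb{R}$, using the basis $(1, i)$ of $\mathbb{C}$ over $\mathbb{R}$: each complex entry $u + vi$ of a generic $2\times 2$ complex matrix becomes the real block $\begin{psmallmatrix} u & -v \\ v & u \end{psmallmatrix}$, and $N_{\mathbb{C}/\mathbb{R}}(z) = z\bar z = (\operatorname{Re} z)^2 + (\operatorname{Im} z)^2$:

```python
from sympy import symbols, Matrix, I, expand

x1, x2, x3, x4 = symbols('x1:5', real=True)
y1, y2, y3, y4 = symbols('y1:5', real=True)
A = Matrix([[x1 + y1*I, x2 + y2*I],
            [x3 + y3*I, x4 + y4*I]])   # generic C-linear map on C^2

# the same map as an R-linear map on R^4: each entry u + v*I
# becomes the real 2x2 block [[u, -v], [v, u]]
AR = Matrix([[x1, -y1, x2, -y2],
             [y1,  x1, y2,  x2],
             [x3, -y3, x4, -y4],
             [y3,  x3, y4,  x4]])

dC_re, dC_im = A.det().as_real_imag()  # det over C, split into re/im parts
# Theorem: det_R(A) = N_{C/R}(det_C(A)) = re^2 + im^2
assert expand(AR.det() - (dC_re**2 + dC_im**2)) == 0
```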

More generally, consider any $nd\times nd$ matrix $A$ built from $n^2$-many $d\times d$ blocks, where the blocks pairwise commute (i.e. $A_{ij}A_{k\ell}=A_{k\ell}A_{ij}$ for any blocks $A_{ij}$ and $A_{k\ell}$). Then the formula for the determinant of an $n\times n$ matrix $[a_{ij}]$ (so, the Leibniz formula) makes sense with the block matrices $A_{ij}$ in place of the matrix entries $a_{ij}$ - we can call this the "block determinant" ${\rm bldet}(A)$, which is itself a $d\times d$ matrix. Then we have:

Theorem. $~\det({\rm bldet}(A))=\det(A)$.

The first theorem is a corollary of this one: pick a basis for $L/K$, so that a matrix in $M_n(L)$ becomes a matrix in $M_{nd}(K)$; the blocks commute because they represent multiplication-by-$a$ as a $K$-linear operator on $L$ for various elements $a\in L$, and $L$ is commutative (being a field).
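A small sympy sketch of the block-determinant theorem, using $2\times 2$ blocks of the form $uI + vJ$ with $J^2 = -I$ (such blocks model complex numbers, so they all commute):

```python
from sympy import symbols, Matrix, BlockMatrix, eye, expand

u1, v1, u2, v2, u3, v3, u4, v4 = symbols('u1 v1 u2 v2 u3 v3 u4 v4')
J = Matrix([[0, -1], [1, 0]])          # J**2 == -eye(2)

def blk(u, v):
    # multiplication-by-(u + v*i) as a real 2x2 block; all such blocks commute
    return u*eye(2) + v*J

A11, A12, A21, A22 = blk(u1, v1), blk(u2, v2), blk(u3, v3), blk(u4, v4)
A = BlockMatrix([[A11, A12], [A21, A22]]).as_explicit()  # the 4x4 matrix
bldet = A11*A22 - A12*A21              # Leibniz formula with blocks as entries
assert expand(A.det() - bldet.det()) == 0
```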