On the definition of scalar multiplication for quaternionic vector spaces, in Simon, Representations of Finite and Compact Groups (Theorem II.6.4)


I'm studying from Simon's book Representations of Finite and Compact Groups, and I'm struggling with how the quaternionic spaces are introduced (please correct me whenever I'm wrong, and please be patient with me; I'm not a mathematician…).

First, he classifies the irreps of finite groups on complex vector spaces into complex and non-complex, according to whether the representation is self-conjugate, i.e. equivalent to its complex conjugate representation: non-complex means self-conjugate, complex means not. (By this point he is assuming unitarity; to keep everything as simple as possible, we can stick to complex unitary matrices.)

Then he shows that each non-complex representation commutes with an anti-unitary map $J$ satisfying $J^2=\pm\rm I$, and he calls real the (non-complex) representations with $J^2=\rm I$ and quaternionic those with $J^2=-\rm I$ (the two cases being mutually exclusive).

Then he explains why he used the term “real”: in a suitable orthonormal basis a real representation is made up of real matrices. Now he is going to address “quaternionic”. This is the passage in which quaternionic linear structures are introduced for the very first time:

[Quoted passage from Simon, B. (1995), Representations of Finite and Compact Groups, where the scalar multiplication is defined: for $\gamma=\alpha+\beta j\in\mathbb H$, the product $x\gamma$ is defined as $(\alpha+\beta J)x$.]

What I really do not get here is the definition of scalar multiplication. From the way it is written, I thought he wanted the quaternionic space to be a right vector space over the division ring $\mathbb H$. [*]
But then shouldn't it be:

$$(x\gamma_1) \gamma_2=x (\gamma_1 \gamma_2)$$

for a right multiplication? It seems to me that with his definition we end up with a left multiplication,

$$(x\gamma_1) \gamma_2=x (\gamma_2 \gamma_1)\,.$$

So what is he trying to do here? Is he defining a left scalar multiplication while denoting it with the scalar on the right? If so, why, and for what purpose? What am I missing here?

As far as I understand, the author has defined a scalar multiplication that maps the quaternion $\alpha + \beta j$ and the vector $x$ to $(\alpha+\beta J)x$, and this brings about a (I'd say left) quaternionic linear structure. The starting complex representation $U$ is then quaternionic homogeneous with respect to such a product, hence quaternionic linear, hence a representation on a quaternionic vector space. Isn't this enough to prove (that part of) the claim?
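The "left action in right notation" claim is easy to check numerically. Below is a quick sketch (not from the book; the helper names `smul`, `qmul` and the particular matrix realization of $J$ are my own): $J$ is realized on $\mathbb C^2$ as complex conjugation followed by a real rotation, so $J^2=-\rm I$, and defining $x\gamma := (\alpha+\beta J)x$ for $\gamma=\alpha+\beta j$ composes as $(x\gamma_1)\gamma_2 = x(\gamma_2\gamma_1)$.

```python
import numpy as np

# J: anti-unitary on C^2 with J^2 = -I (conjugation followed by a real rotation)
M = np.array([[0, -1], [1, 0]], dtype=complex)
def J(x):
    return M @ np.conj(x)

# Simon-style "scalar multiplication": x*(a + b j) := (a + b J) x
def smul(x, a, b):
    return a * x + b * J(x)

# Quaternion product (a1 + b1 j)(a2 + b2 j), using j a = conj(a) j and j^2 = -1
def qmul(a1, b1, a2, b2):
    return (a1 * a2 - b1 * np.conj(b2), a1 * b2 + b1 * np.conj(a2))

x = np.array([1 + 2j, 3 - 1j])
g1 = (0.5 + 1j, 2 - 0.3j)   # gamma_1 = a1 + b1 j
g2 = (-1 + 0.2j, 0.7j)      # gamma_2 = a2 + b2 j

lhs = smul(smul(x, *g1), *g2)       # (x gamma_1) gamma_2
rhs = smul(x, *qmul(*g2, *g1))      # x (gamma_2 gamma_1)  -- i.e. a LEFT action
print(np.allclose(lhs, rhs))        # True
```

So the composition order is indeed reversed, exactly as the question observes.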


[*] As far as I understand, the requirement of right scalar multiplication in quaternionic spaces is made in order to preserve the usual matrix tools for computing the action of a linear operator. In essence, say $V$ and $W$ are quaternionic vector spaces, with bases $(e_i)$ and $(\epsilon_\alpha)$ respectively, and say $A\colon V\to W$ is quaternionic linear; when all multiplications are right:

$$A(x)=A(e_i x^i)=A(e_i)\,x^i=\epsilon_\alpha A^\alpha_i x^i\,,$$

so the matrix is on the left, the vector on the right, and one can stick to row-by-column products; in the case of left multiplications, some adjustments are needed.
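A small numerical sanity check of this compatibility (my own sketch, with quaternions realized as $2\times 2$ complex matrices and hypothetical helper names `matvec`, `rscal`): with right scalar multiplication, left matrix multiplication is quaternionic linear, $A(v\gamma)=(Av)\gamma$, which boils down to associativity of quaternion multiplication.

```python
import numpy as np

# Quaternions as 2x2 complex matrices: 1, i, j, k
I2 = np.eye(2, dtype=complex)
Qi = np.array([[1j, 0], [0, -1j]])
Qj = np.array([[0, 1], [-1, 0]], dtype=complex)
Qk = Qi @ Qj

def rand_q(rng):
    a, b, c, d = rng.standard_normal(4)
    return a * I2 + b * Qi + c * Qj + d * Qk

rng = np.random.default_rng(0)
n = 3
A = np.array([[rand_q(rng) for _ in range(n)] for _ in range(n)])  # (n,n,2,2)
v = np.array([rand_q(rng) for _ in range(n)])                      # (n,2,2)
g = rand_q(rng)

def matvec(A, v):
    # (A v)_a = sum_i A[a,i] v[i]   (matrix on the left)
    return np.einsum('aipq,iqr->apr', A, v)

def rscal(v, g):
    # right scalar multiplication: (v g)_i = v_i g
    return np.einsum('ipq,qr->ipr', v, g)

# Left matrix action commutes with right scalar action: A(v g) = (A v) g
print(np.allclose(matvec(A, rscal(v, g)), rscal(matvec(A, v), g)))  # True
```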

Should there be a more profound reason to ask for a right multiplication, I'm afraid I'm totally missing the point.


There is 1 answer below.


Perhaps wrapping up the questioner's and @CaptainLama's comments into a "more durable" "answer":

Yes, the right-multiplication action of a quaternion algebra $H$ on column vectors with entries in $H$ is a right action, which in coordinate-wise terms means $(va)b=v(ab)$ (as matrices, etc.)

Yes, this can be converted to a left structure by using the standard/natural involution on the quaternion algebra. But, as observed, this entails non-commutativity with the left action of matrices with entries in $H$.

Um, so, don't convert this to a left action, regardless of the availability of an involution to flip left-and-right. After all, we surely want these ideas to apply to other division algebras that don't necessarily have involutions. (Involutions give isomorphisms to their "opposite algebras".)

So, just settle for the "right" vector space structure by $H$ on column vectors, and then certainly left multiplication by matrices with entries in $H$ commutes with the right action of $H$.

If we (misguidedly, but out of curiosity?) go down the path of asking about left matrix multiplications that commute with left multiplication by $H$, we do not get matrices with entries in $H$, but only matrices with entries in the center of $H$...
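This last point is easy to verify numerically in the one-dimensional case (a sketch of my own, with $\mathbb H$ realized as $2\times 2$ complex matrices): left multiplication by $q$ commutes with left multiplication by all of $\mathbb H$ only when $q$ lies in the center, i.e. $q$ is real.

```python
import numpy as np

# Quaternions as 2x2 complex matrices
I2 = np.eye(2, dtype=complex)
Qi = np.array([[1j, 0], [0, -1j]])
Qj = np.array([[0, 1], [-1, 0]], dtype=complex)
Qk = Qi @ Qj

def commutes_with_H(q):
    # Does left multiplication by q commute with left multiplication by all of H?
    return all(np.allclose(q @ g, g @ q) for g in (Qi, Qj, Qk))

print(commutes_with_H(3.0 * I2))  # True: real quaternions are central
print(commutes_with_H(Qi))        # False: i j = k, but j i = -k
```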

So, well, it depends what we want. And, yes, often, we misrepresent what we want, because we'll not really use the misrepresented version anyway. :)

(And, yes, if we want to avoid the significantly-meaningless left/right business, since it is mostly notational, we can talk about $H^{\mathrm{opp}}$-modules' endomorphism rings...)