Let $$ S = \left\{ \left(\begin{matrix} a & a \\ a & a \\ \end{matrix}\right) \ \middle| \ a \in \mathbb{R} \right\} $$ and let $\cdot$ be the usual matrix multiplication operation.
Is $(S,\cdot)$ a group, a semigroup, or a monoid?
Matrices in group theory
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). 2.8k views. There are 3 answers below.
It is a semigroup: $S$ is closed under matrix multiplication, and the operation is associative because matrix multiplication is always associative.
It has identity element $\frac12\begin{bmatrix}1&1\\1&1\end{bmatrix}$, so it is also a monoid.
It can't be a group, however, since $\begin{bmatrix}0&0\\0&0\end{bmatrix}$ has no inverse.
Actually, this set is a field isomorphic to $\mathbb R$! If you remove the zero element, you can conclude that it's a group under multiplication... a linearly ordered group, in fact.
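To make the multiplicative part of this claim concrete, here is a small plain-Python sketch (the helper names `matmul2`, `M`, and `phi` are mine, not part of the answer). Since $M(a)M(b)=M(2ab)$, the map $\varphi(M(a))=2a$ is multiplicative, which is the isomorphism onto $(\mathbb{R},\cdot)$ mentioned above:

```python
# Sketch: elements of S are M(a) = [[a, a], [a, a]], stored as nested lists.
# Since M(a) M(b) = M(2ab), the map phi(M(a)) = 2a is multiplicative.

def matmul2(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def M(a):
    """The element of S with parameter a."""
    return [[a, a], [a, a]]

def phi(X):
    """Candidate isomorphism S -> R: M(a) |-> 2a."""
    return 2 * X[0][0]

a, b = 3.0, -0.5
assert matmul2(M(a), M(b)) == M(2 * a * b)                # closure: M(a)M(b) = M(2ab)
assert phi(matmul2(M(a), M(b))) == phi(M(a)) * phi(M(b))  # phi is multiplicative
```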
Your confusion, as you say, comes from the fact that you have matrices that are at the same time singular and invertible (in a non-ordinary sense).
But what is in parentheses is no side remark.
Equivalently: you have matrices not invertible in the ordinary sense and invertible in a non-ordinary sense. Does it sound less problematic now?
I think that your real problem is a mixing of contexts.
When you extract an abstract algebraic structure from a concrete set, it can happen that the structure has something called an identity (indeed, possibly more than one). For instance, a ring can have a multiplicative identity, also called the unity, alongside an additive identity, also called the zero.
With an $n\times n$ real matrix it is possible to
- multiply a matrix by a real $n$-component vector (there is an identity matrix, in the sense of an operator $\mathrm{id}_{\mathbb{R}^n}$ that leaves every vector unchanged)
- add a matrix to a matrix
- multiply a matrix by a matrix
- multiply a matrix by a real
From the last three operations, the set of all real $n\times n$ matrices acquires a rich algebraic structure, that of an associative algebra, in which you can find the multiplicative identity ($I$) and the additive identity ($O$). The first operation, instead, lets these square matrices act on the space of real $n$-component vectors, with the multiplicative identity element ($I$) corresponding to the identity operator $\mathrm{id}_{\mathbb{R}^n}$.
But this last point is a theorem, not a definition.
Now, in the set $S$ you are considering, you are not assuming that operation 1 is available. Suppose anyway that it is, and let it be the ordinary multiplication of a matrix by a vector.
Now you see that to the multiplicative identity there corresponds an operator that is not the identity operator when applied multiplicatively to a vector: \begin{equation} \begin{bmatrix} 1/2&1/2\\ 1/2&1/2 \end{bmatrix} \begin{bmatrix} x\\ y \end{bmatrix} =\frac{x+y}{2} \begin{bmatrix} 1\\ 1 \end{bmatrix} \neq \begin{bmatrix} x\\ y \end{bmatrix} \end{equation}
And to the multiplicative inverse of an element of $S$ there corresponds an operator that is not the inverse (in the operator sense) of the operator represented by that element, simply because the latter does not exist. For the same reason, the singularity of all the operators in $S$ and the fact that an element of $S$ has a multiplicative inverse have nothing to do with each other.
So you thought that the multiplicative invertibility of an element of $S$ was tied to the invertibility (in the operator sense) of the operator represented by that element, but this is not the case, because this representation preserves neither the identity nor inverses.
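This point can be illustrated numerically in plain Python (a sketch; the helper names `matmul2` and `matvec` are mine): the identity of the monoid $S$ is a singular matrix, and it moves almost every vector.

```python
# Sketch: E = (1/2) [[1,1],[1,1]] is the identity *inside S*, yet it is a
# singular matrix and is not the identity operator on R^2.

def matmul2(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(X, v):
    """Apply a 2x2 matrix to a 2-component vector."""
    return [sum(X[i][k] * v[k] for k in range(2)) for i in range(2)]

E = [[0.5, 0.5], [0.5, 0.5]]
A = [[3.0, 3.0], [3.0, 3.0]]

# E is a two-sided identity within S ...
assert matmul2(E, A) == A and matmul2(A, E) == A
# ... but it is singular ...
assert E[0][0] * E[1][1] - E[0][1] * E[1][0] == 0.0   # det E = 0
# ... and it does not fix vectors: E (x, y)^T = ((x+y)/2) (1, 1)^T
assert matvec(E, [1.0, 2.0]) == [1.5, 1.5]
```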
We know that matrix multiplication is associative. We need to check that $S$ is also closed under this operation. Indeed, $$ \begin{pmatrix} a & a \\ a & a \end{pmatrix} \begin{pmatrix} b & b \\ b & b \end{pmatrix} = \begin{pmatrix} 2ab & 2ab \\ 2ab & 2ab \end{pmatrix} $$ shows that it is closed under multiplication. Therefore $S$ is at least a semigroup.
To check whether $S$ is a monoid, we need to see if there is an identity element. If $$ \begin{pmatrix} e & e \\ e & e \end{pmatrix} $$ is to be an identity element, then we would need that $$ \begin{pmatrix} e & e \\ e & e \end{pmatrix} \begin{pmatrix} a & a \\ a & a \end{pmatrix} = \begin{pmatrix} a & a \\ a & a \end{pmatrix} = \begin{pmatrix} a & a \\ a & a \end{pmatrix} \begin{pmatrix} e & e \\ e & e \end{pmatrix} $$ holds for every $a$. This is equivalent to requiring that $$ \begin{pmatrix} 2ea & 2ea \\ 2ea & 2ea \end{pmatrix} = \begin{pmatrix} a & a \\ a & a \end{pmatrix} = \begin{pmatrix} 2ae & 2ae \\ 2ae & 2ae \end{pmatrix} $$ holds for every $a$, which occurs iff $a = 2ae$ and thus iff $e=1/2$. Thus we have an identity element $$ \begin{pmatrix} 1/2 & 1/2 \\ 1/2 & 1/2 \end{pmatrix}, $$ so $S$ is at least a monoid.
Finally, to check whether $S$ is a group, we need to see if there are inverses. Given a matrix $$\begin{pmatrix} a & a \\ a & a \end{pmatrix}$$ in $S$, its inverse $$\begin{pmatrix} i & i \\ i & i \end{pmatrix}$$ must satisfy $$ \begin{pmatrix} a & a \\ a & a \end{pmatrix} \begin{pmatrix} i & i \\ i & i \end{pmatrix} = \begin{pmatrix} 1/2 & 1/2 \\ 1/2 & 1/2 \end{pmatrix} = \begin{pmatrix} i & i \\ i & i \end{pmatrix} \begin{pmatrix} a & a \\ a & a \end{pmatrix}. $$ Such an inverse exists iff $2ai=1/2$, which occurs iff $a\ne0$ and $i=1/(4a)$. This reveals that the element $$ \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} $$ does not have an inverse, so $S$ is not a group.
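The inverse formula $i=1/(4a)$, and the failure at $a=0$, can likewise be verified in plain Python (same hedged sketch style; helper names are mine):

```python
# Sketch: M(a) M(i) = M(2ai) equals the identity M(1/2) iff 2ai = 1/2,
# i.e. iff a != 0 and i = 1/(4a); the zero matrix has no inverse.

def matmul2(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def M(a):
    """The element of S with parameter a."""
    return [[a, a], [a, a]]

E = M(0.5)
a = 4.0
inv = M(1 / (4 * a))                 # i = 1/(4a) = 1/16, exact in binary
assert matmul2(M(a), inv) == E == matmul2(inv, M(a))

# The zero matrix is absorbing, so no candidate can bring it back to E:
Z = M(0.0)
assert all(matmul2(Z, M(i)) == Z for i in (-1.0, 0.5, 100.0))
```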
In conclusion $S$ is a semigroup and a monoid, but not a group.