Give $2\times2$ real matrices, with no zero entries, satisfying the following equations


Find $2\times2$ real matrices with no zero entries, named $A$, $B$, $C$, and $D$, that satisfy the following equations:

  • $A^2 = -I$
  • $B^2 = 0$
  • $CD = -DC$ with $CD\neq0$

    I couldn't come up with anything. Can you?
Two solutions follow.

BEST ANSWER

$\newcommand{\m}[1]{\begin{bmatrix} #1\end{bmatrix}}$

For all three questions one can give a "conceptual" solution (that is, a solution with an explanation of why it should work, as opposed to just solving a bunch of equations), except for the "non-zero entries" part. In each case it seems very likely that the solution can be modified to one with non-zero entries by conjugating with a generic invertible matrix.

For $A^2=-I$: In the plane, multiplying a vector by $-1$ is the same as rotating it through an angle $\pi$, which in turn is the same as rotating through an angle $\pi/2$ twice. So: Let $A_\theta=\m{\cos(\theta)&-\sin(\theta)\\\sin(\theta)&\cos(\theta)}$. Then $A_{\pi/2}^2=A_\pi=-I$, but $A_{\pi/2}$ has zero entries. Let $A=PA_{\pi/2}P^{-1}$ for a "random" invertible matrix $P$; if that doesn't work try a different $P$.
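The construction above can be checked numerically. This is a quick sketch using NumPy; the particular $P$ below is one arbitrary invertible choice, and it happens to work on the first try.

```python
import numpy as np

# Rotation by pi/2: squares to -I, but has zero entries
A_rot = np.array([[0, -1],
                  [1,  0]])

# Conjugate by an arbitrarily chosen invertible P to fill in the zeros
P = np.array([[1, 1],
              [1, 2]])           # det(P) = 1, so P is invertible
A = P @ A_rot @ np.linalg.inv(P)

# A = [[3, -2], [5, -3]]: no zero entries, and A @ A = -I
assert np.all(A != 0)
assert np.allclose(A @ A, -np.eye(2))
```

Since $A = P A_{\pi/2} P^{-1}$, we get $A^2 = P A_{\pi/2}^2 P^{-1} = P(-I)P^{-1} = -I$ regardless of which invertible $P$ is used; only the "no zero entries" condition depends on the choice.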

(Note I don't think that saying "keep trying until you find a $P$ that works" really negates my claim to be avoiding random diddling around here, because, although I haven't proved it, it seems clear that "almost every" $P$, in any of various senses, should work.)

For $B^2=0$: Both eigenvalues of $B$ must be $0$. The obvious example of a non-zero matrix with eigenvalues $\lambda=0,0$ is $\m{0&1\\0&0}$. This time I actually verified that if $B=P\m{0&1\\0&0}P^{-1}$ where $P=\m{1&1\\1&2}$ then indeed $B$ has no zero entries.
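That verification is easy to reproduce; here is a short check with the same $P$ as in the answer:

```python
import numpy as np

# Nilpotent model with zero entries, then conjugate to remove them
N = np.array([[0, 1],
              [0, 0]])
P = np.array([[1, 1],
              [1, 2]])           # det(P) = 1
B = P @ N @ np.linalg.inv(P)

# B = [[-1, 1], [-1, 1]]: no zero entries, and B @ B = 0
assert np.all(B != 0)
assert np.allclose(B @ B, np.zeros((2, 2)))
```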

For $CD=-DC\ne0$: Hmm. If $C$ happens to be invertible this says $-D=CDC^{-1}$, so $-D$ is similar to $D$, hence $D$ and $-D$ have the same eigenvalues. So the obvious candidate is $D=\m{1&0\\0&-1}$.

Now you think about this, and you realize that for that particular $D$ you can obtain $-D$ from $D$ by swapping the rows and then swapping the columns. Or to put the same thing in terms of concepts instead of calculations: If $D$ is the matrix for a linear transformation with respect to the ordered basis $(b_1,b_2)$, then $-D$ is the matrix for the same transformation with respect to the basis $(b_2,b_1)$; so the "similarizer" should be the map $x\to(x_2,x_1)$. Sure enough, if $C=\m{0&1\\1&0}$ then $CD=-DC\ne0$. So if $P$ is a generic invertible matrix then $D'=PDP^{-1}$, $C'=PCP^{-1}$ should work.
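The conjugation step can again be checked numerically. The $P$ below is one arbitrary choice that happens to leave no zero entries in either conjugate (the $P=\m{1&1\\1&2}$ used earlier leaves a zero in $C'$, so a different $P$ is used here):

```python
import numpy as np

D0 = np.array([[1,  0],
               [0, -1]])   # eigenvalues 1 and -1
C0 = np.array([[0, 1],
               [1, 0]])    # swaps the two basis vectors

# The model pair anticommutes, and C0 @ D0 is non-zero
assert np.allclose(C0 @ D0, -(D0 @ C0))

# Conjugating both matrices by the same invertible P preserves the relation
P = np.array([[1, 2],
              [1, 3]])     # det(P) = 1
Pinv = np.linalg.inv(P)
C, D = P @ C0 @ Pinv, P @ D0 @ Pinv

# C = [[5, -3], [8, -5]], D = [[5, -4], [6, -5]]: no zero entries
assert np.all(C != 0) and np.all(D != 0)
assert np.allclose(C @ D, -(D @ C))
assert not np.allclose(C @ D, np.zeros((2, 2)))
```

The relation survives conjugation because $C'D' = PCP^{-1}PDP^{-1} = P(CD)P^{-1} = P(-DC)P^{-1} = -D'C'$.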

ANOTHER ANSWER

We have $$A=\begin{pmatrix}a&b\\c&d\end{pmatrix}\implies A^2=\begin{pmatrix}a^2+bc&b(a+d)\\c(a+d)&d^2+bc\end{pmatrix}=\begin{pmatrix}-1&0\\0&-1\end{pmatrix}$$ Notice that $$a^2+bc=d^2+bc\implies a^2=d^2$$ and you should now hopefully be able to do the rest.
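To finish the rest (this is one way to do it, not necessarily the intended one): since $b$ and $c$ must be non-zero, $b(a+d)=0$ forces $d=-a$, and the diagonal entries then require $a^2+bc=-1$. For example $a=1$, $b=1$, $c=-2$, $d=-1$ works:

```python
import numpy as np

# One solution of a^2 + bc = -1 with d = -a and no zero entries:
# a = 1, b = 1, c = -2, d = -1
A = np.array([[ 1,  1],
              [-2, -1]])

assert np.all(A != 0)
assert np.allclose(A @ A, -np.eye(2))   # A^2 = -I
```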