If $A^2 = B^2$, then $A=B$ or $A=-B$


Let $A_{n\times n},B_{n\times n}$ be square matrices with $n \geq 2$. If $A^2 = B^2$, then $A=B$ or $A=-B$.

This is wrong but I don't see why. Do you have any counterexample?

7 Answers

BEST ANSWER

$$A=\begin{pmatrix}0&1\\0&0\end{pmatrix}\quad; \quad B=\begin{pmatrix}0&0\\1&0\end{pmatrix}$$ and the generalization is easy: it suffices to take two distinct nilpotent matrices of nilpotency index $2$ with $A \neq -B$, so that $A^2 = B^2 = 0$ while $A \neq \pm B$.
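The counterexample above can be checked numerically (a quick sketch, not part of the answer; the `matmul` helper is mine):

```python
# Verify that A and B are nilpotent of index 2, so A^2 = B^2 = 0 with A != ±B.

def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, 1], [0, 0]]
B = [[0, 0], [1, 0]]

print(matmul(A, A))  # [[0, 0], [0, 0]]
print(matmul(B, B))  # [[0, 0], [0, 0]]
print(A == B)        # False
```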

Answer

$$\begin{pmatrix} 0 & 1 \\ 0 & 0\end{pmatrix},\; \begin{pmatrix} 0 & 0 \\ 0 & 0\end{pmatrix}$$

Answer

$A = I_2$, $B = \begin{bmatrix}0 & 1\\1 & 0\end{bmatrix}$

Then $A^2 = B^2 = I_2$, but $A \ne \pm B$.

Edit I
In general, you can take any two distinct reflection matrices (the ones in Darth Geek's answer, for example). Any such matrix $R$ satisfies $R^2 = I$, since reflections are both orthogonal and symmetric. You can find plenty in any dimension; see for example: https://en.wikipedia.org/wiki/Householder_transformation.
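A concrete instance of this family (my sketch, using the standard Householder formula $R = I - 2vv^T/(v^Tv)$ for a nonzero vector $v$):

```python
# Build a Householder reflection and check R^2 = I; then A = I, B = R
# gives A^2 = B^2 with A != ±B.

def householder(v):
    """Householder reflection about the hyperplane orthogonal to v."""
    n = len(v)
    vv = sum(x * x for x in v)
    return [[(1 if i == j else 0) - 2 * v[i] * v[j] / vv for j in range(n)]
            for i in range(n)]

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

R = householder([1.0, 2.0, 2.0])   # arbitrary nonzero vector
R2 = matmul(R, R)
print(all(abs(R2[i][j] - (1 if i == j else 0)) < 1e-12
          for i in range(3) for j in range(3)))  # True: R^2 = I
```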

Edit II
If $A$ and $B$ are symmetric, positive definite matrices, then $A^2 = B^2$ implies $A = B$.

Since $A$ and $B$ are real and symmetric, there exist orthogonal matrices $P$ and $Q$ such that $PAP^T$ and $QBQ^T$ are diagonal. Moreover, positive definiteness implies that the eigenvalues of $A$ and of $B$ are the positive square roots of the eigenvalues of $A^2 = B^2$, so $A$ and $B$ have the same eigenvalues. Hence we can choose $P$ and $Q$ such that $PAP^T = QBQ^T = \Lambda$ is diagonal. Now $QB^2Q^T = \Lambda^2$ implies that $QA^2Q^T = \Lambda^2$ (since $A^2 = B^2$). Finally, observe that any eigenvector $x$ of $A^2$ corresponding to an eigenvalue $\lambda^2$ is also an eigenvector of $A$ corresponding to the eigenvalue $\lambda$: indeed $(A+\lambda I)(A-\lambda I)x = (A^2 - \lambda^2 I)x = 0$, and $A + \lambda I$ is positive definite, hence invertible, so $(A - \lambda I)x = 0$. Therefore, $QA^2Q^T = \Lambda^2$ implies that $QAQ^T = \Lambda = QBQ^T$, and hence $A = B$.

Note: If we assume that $A$ is symmetric positive semidefinite and $B$ is symmetric positive definite, then $A^2 = B^2$ implies that $A$ is also positive definite, as it cannot have $0$ as an eigenvalue.
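A small numeric illustration of this uniqueness (my sketch; the closed-form $2\times 2$ square root $\sqrt{M} = (M + \sqrt{\det M}\, I)/\sqrt{\operatorname{tr} M + 2\sqrt{\det M}}$, obtained from Cayley-Hamilton and valid for symmetric positive definite $M$, is my addition, not from the answer):

```python
import math

def spd_sqrt_2x2(M):
    """Unique SPD square root of a 2x2 symmetric positive definite matrix."""
    s = math.sqrt(M[0][0] * M[1][1] - M[0][1] * M[1][0])  # sqrt(det M)
    t = math.sqrt(M[0][0] + M[1][1] + 2 * s)              # sqrt(tr M + 2s)
    return [[(M[i][j] + (s if i == j else 0)) / t for j in range(2)]
            for i in range(2)]

A = [[2.0, 1.0], [1.0, 2.0]]   # SPD, eigenvalues 1 and 3
A2 = [[5.0, 4.0], [4.0, 5.0]]  # A squared
print(spd_sqrt_2x2(A2))        # [[2.0, 1.0], [1.0, 2.0]], i.e. A is recovered
```

The SPD square root of $A^2$ is unique, which is exactly why $A = B$ is forced in the positive definite case.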

Answer

If $a^{2}-b^{2}=0$ in e.g. $\mathbb{R}$ then we have $\left(a-b\right)\left(a+b\right)=a^{2}-b^{2}=0$ and consequently $a=b\vee a=-b$.

For matrices, however, $\left(A-B\right)\left(A+B\right)=A^{2}+AB-BA-B^{2}$, where $AB$ and $BA$ are not necessarily equal.

Secondly, $UV=0$ does not necessarily imply that $U=0\vee V=0$.

For counterexamples see the other answers.
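Both failure points can be seen concretely (a sketch with $2\times 2$ matrices; the matrices chosen here are my own illustration):

```python
# First failure: AB != BA in general.
# Second failure: UV = 0 does not force U = 0 or V = 0 (zero divisors).

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, 1], [0, 0]]
B = [[0, 0], [1, 0]]
print(matmul(A, B))  # [[1, 0], [0, 0]]
print(matmul(B, A))  # [[0, 0], [0, 1]]  -- so AB != BA

U = [[0, 1], [0, 0]]
V = [[1, 0], [0, 0]]
print(matmul(U, V))  # [[0, 0], [0, 0]] although U != 0 and V != 0
```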

Answer

$A = \begin{pmatrix} 1/2 & 1/4 \\ 3 & -1/2\end{pmatrix}$, $B = \begin{pmatrix} 1 & 0 \\ 0 & 1\end{pmatrix}$; one checks $A^2 = I_2 = B^2$, while $A \neq \pm B$.

Answer

Let $a,b$ be distinct permutations in the symmetric group $S_{n}$ such that $a^{2}=b^{2}$ (for example, two distinct transpositions). Let $A,B$ be the permutation matrices corresponding to $a,b$. Then $A^{2}=B^{2}$, but $A \neq \pm B$ (permutation matrices have nonnegative entries, so $A=-B$ is impossible).
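A concrete instance of this construction (my choice of permutations: the transpositions $(1\,2)$ and $(1\,3)$ in $S_3$, both of which square to the identity):

```python
# Permutation matrices of two distinct transpositions satisfy
# A^2 = B^2 = I while A != ±B.

def perm_matrix(p):
    """Permutation matrix sending basis vector j to basis vector p[j]."""
    n = len(p)
    return [[1 if p[j] == i else 0 for j in range(n)] for i in range(n)]

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
A = perm_matrix([1, 0, 2])   # swaps coordinates 0 and 1
B = perm_matrix([2, 1, 0])   # swaps coordinates 0 and 2
print(matmul(A, A) == I3, matmul(B, B) == I3, A == B)  # True True False
```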

Answer

Regarding drhab's answer:

If $A$ is a generic matrix (one can simulate this property by choosing $A$ at random), then $A$ has distinct non-zero complex eigenvalues $(\lambda_i)_i$ such that $\lambda_i^2 \neq \lambda_j^2$ whenever $i \neq j$. Thus we may assume that $A = \operatorname{diag}((\lambda_i)_i)$ and $A^2 = \operatorname{diag}(\mu_1,\cdots,\mu_n)$, where the $(\mu_i)_i$ are distinct. Since $B$ commutes with $B^2 = A^2$, and $A^2$ has distinct diagonal entries, $B = \operatorname{diag}(\alpha_1,\cdots,\alpha_n)$ where, for every $i$, $\alpha_i = \pm \lambda_i$. There are exactly $2^n$ solutions in $B$; in particular, $AB = BA$.
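The counting step can be illustrated for $n = 3$ (a sketch with diagonal matrices represented by their diagonals; the specific eigenvalues are my choice):

```python
# For A = diag(1, 2, 3), the diagonal matrices B with B^2 = A^2 are exactly
# the 2^3 = 8 sign choices diag(±1, ±2, ±3), each of which commutes with A.

from itertools import product

lam = [1, 2, 3]
solutions = [[s * l for s, l in zip(signs, lam)]
             for signs in product([1, -1], repeat=len(lam))]

# every candidate really squares to A^2 = diag(1, 4, 9)
assert all([d * d for d in diag] == [1, 4, 9] for diag in solutions)
print(len(solutions))  # 8
```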