Relationship between $A$ and $\frac{1}{2}(A + A^T)$?

Is there a general relationship between the determinant (or the eigenvalues/eigenvectors) of a square matrix $A$ and those of its "symmetrification" $\frac{1}{2}(A + A^T)$?

Let $\mathbb K$ be a field. Denote by $\mathbb K^{n\times n}$ the space of matrices, by $\mathbb K_\mathrm{sym}^{n\times n}$ the space of symmetric matrices and by $\mathbb K_\mathrm{antisym}^{n\times n}$ the space of antisymmetric matrices (also denoted $\mathfrak{o}_n(\mathbb K)$). Given a matrix $A\in\mathbb K^{n\times n}$, you get a decomposition into a symmetric part $A_\mathrm{sym}$ and an antisymmetric part $A_\mathrm{antisym}$: $$A=A_\mathrm{sym}+A_\mathrm{antisym} =\underbrace{\frac{1}{2}(A+A^T)}_{\in\mathbb K_\mathrm{sym}^{n\times n}}+\underbrace{\frac{1}{2}(A-A^T)}_{\in\mathbb K_\mathrm{antisym}^{n\times n}},$$ hence a decomposition: $$\mathbb K^{n\times n}=\mathbb K_\mathrm{sym}^{n\times n}+\mathbb K_\mathrm{antisym}^{n\times n}.$$ (The sum is direct iff $\operatorname{char}\mathbb K\neq 2$, i.e. $1+1\neq 0$ in $\mathbb K$: a matrix $A$ that is both symmetric and antisymmetric has to fulfill $A=A^T=-A$, hence $(1+1)A=0$, so $A=0$ if $\operatorname{char}\mathbb K\neq 2$, while the identity matrix $E_n$ is both symmetric and antisymmetric if $\operatorname{char}\mathbb K=2$.)
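A quick numerical sketch of this decomposition, using NumPy over $\mathbb K=\mathbb R$ (the matrix entries below are arbitrary):

```python
import numpy as np

# An arbitrary real 3x3 matrix (illustrative example; K = R here)
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])

A_sym = 0.5 * (A + A.T)       # symmetric part
A_antisym = 0.5 * (A - A.T)   # antisymmetric part

assert np.allclose(A_sym, A_sym.T)           # A_sym is symmetric
assert np.allclose(A_antisym, -A_antisym.T)  # A_antisym is antisymmetric
assert np.allclose(A, A_sym + A_antisym)     # the decomposition holds
```

Over $\mathbb R$ the sum is direct, so this decomposition is unique.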

The symmetrification $A\mapsto A_\mathrm{sym}$ is therefore a projection onto a vector subspace, which entails a loss of information. As Thomas Andrews already mentioned, the projection onto the zero matrix is a total loss of information: every antisymmetric matrix $A=B-B^T\propto B_\mathrm{antisym}$ fulfills $A_\mathrm{sym}=0$, so from $A_\mathrm{sym}$ you cannot deduce any property of $A$ that can differ between antisymmetric matrices.

Some information does survive, though: as Thomas Andrews also mentioned, the traces (which do not distinguish antisymmetric matrices, whose trace always vanishes) are the same: $$\operatorname{tr}(A_\mathrm{sym}) =\frac{1}{2}\operatorname{tr}(A+A^T) =\frac{1}{2}(\operatorname{tr}(A)+\operatorname{tr}(A^T)) =\frac{1}{2}(\operatorname{tr}(A)+\operatorname{tr}(A)) =\operatorname{tr}(A)$$ and therefore the sums of the eigenvalues agree as well. Concerning determinants (which behave nicely with respect to multiplication, not addition like the trace), eigenvalues and eigenvectors, my intuition is that nothing can be said in that direction, as all of them will generally vary a lot across the affine subspace $A_\mathrm{sym}+\mathbb K_\mathrm{antisym}^{n\times n}$ that is projected down onto $A_\mathrm{sym}$. To find some examples, you may find the following lemma helpful:
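A small numerical check of both claims (the matrix is an arbitrary illustrative choice): the traces agree, while the determinants already differ for a $2\times 2$ example.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, 3.0]])
A_sym = 0.5 * (A + A.T)   # [[0, -0.5], [-0.5, 3]]

# Traces (and hence the sums of eigenvalues) agree:
assert np.isclose(np.trace(A), np.trace(A_sym))

# Determinants generally differ:
# det(A) = 0*3 - 1*(-2) = 2, det(A_sym) = 0*3 - (-0.5)^2 = -0.25
assert np.isclose(np.linalg.det(A), 2.0)
assert np.isclose(np.linalg.det(A_sym), -0.25)
```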

Given an orthogonal matrix $A\in\mathbb K^{n\times n}$, then: \begin{align*} &2^{2n}\det(A_\mathrm{sym})\det(A_\mathrm{antisym}) =\det(A+A^T)\det(A-A^T) =\det\left((A+A^T)(A-A^T)\right) \\ =&\det(A^2+\underbrace{A^TA-AA^T}_{=E_n-E_n=0}-(A^T)^2) =\det\left(A^2-(A^2)^T\right) =2^n\det\left((A^2)_\mathrm{antisym}\right), \end{align*} hence: $$2^n\det(A_\mathrm{sym})\det(A_\mathrm{antisym}) =\det\left((A^2)_\mathrm{antisym}\right).$$ (This is non-trivial only for even $n$, since for odd $n$ the determinant of an antisymmetric matrix always vanishes.) You can, for example, experiment with complex structures, which in particular are orthogonal and antisymmetric and square to the negative identity.