Prove a statement about matrices.


Let $A_i \in \operatorname{Mat}_{n \times n}(\mathbb{R})$ with $A_i = A_i^T$ for $i \in \{1, \dots, n\}$, and suppose $\sum_{i = 1}^{n}{A_i} = E$ (the identity) and $\sum_{i = 1}^{n}{\operatorname{rk} (A_i)} = n$. How can one prove that $A_i A_j = 0$ for all $i \neq j$?
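As a numerical sanity check of the statement (my own illustration, not part of the question), one can build the $A_i$ as spectral projections $A_i = v_i v_i^T$ for an orthonormal basis $v_1, \dots, v_n$ of $\mathbb{R}^n$; these satisfy all the hypotheses, and the pairwise products indeed vanish:

```python
import numpy as np

# Take any orthonormal basis v_1, ..., v_n of R^n and set A_i = v_i v_i^T.
# Each A_i is symmetric of rank 1, the ranks sum to n,
# the A_i sum to the identity, and A_i A_j = 0 for i != j.
rng = np.random.default_rng(0)
n = 5
# Random orthogonal matrix via QR decomposition; its columns are orthonormal.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = [np.outer(Q[:, i], Q[:, i]) for i in range(n)]

assert all(np.allclose(Ai, Ai.T) for Ai in A)            # each A_i symmetric
assert sum(np.linalg.matrix_rank(Ai) for Ai in A) == n   # ranks sum to n
assert np.allclose(sum(A), np.eye(n))                    # matrices sum to E
assert all(np.allclose(A[i] @ A[j], 0)
           for i in range(n) for j in range(n) if i != j)  # pairwise products vanish
print("all checks passed")
```

This only verifies one family of examples, of course; the point of the question is that the hypotheses force this behaviour in general.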

Best answer:

Since $\sum\limits_{i = 1}^{n}{A_i} = E$, the images of the $A_i$ together span $\mathbb{R}^{n}$; and since $\sum\limits_{i = 1}^{n}{\operatorname{rk} (A_i)} = n$, the dimensions of these images add up to exactly $n$, so the sum $\operatorname{im} A_1 + \dots + \operatorname{im} A_n$ is direct. In particular $\operatorname{im} A_{i} \cap \operatorname{im} A_{j} = 0$ for $i\neq j$.

Each $A_i$ is symmetric, hence diagonal in some basis. If $\operatorname{rk} A_i = k$, then $k$ of these basis vectors, say $v_1, \dots, v_k$, satisfy $A_i v_m = \lambda_m v_m$ with $\lambda_m \neq 0$, and they span $\operatorname{im} A_i$. Choose such vectors for every matrix $A_i$; together this gives $n$ vectors. I claim they are linearly independent: their span contains the image of every $A_i$, hence the image of $A_1 + \dots + A_n = E$, which is all of $\mathbb{R}^{n}$, and $n$ vectors spanning $\mathbb{R}^{n}$ must be independent. Now, how do the matrices look in this basis? Since $\operatorname{im} A_i$ is spanned by the $k$ vectors chosen for $A_i$, only the corresponding $k$ rows of $A_i$ can be nonzero, and on the columns of those same vectors $A_i$ acts by $\lambda_1, \dots, \lambda_k$: $$ A_i= \left[ \begin{array}{c|c|c} 0 & 0 & 0\\ \hline \mathrm{something} & \mathrm{diag}(\lambda_1, ..., \lambda_k) & \mathrm{something} \\ \hline 0 & 0 & 0 \end{array} \right] $$ For different $i$ the nonzero rows sit in different positions. But the $A_i$ sum up to $E$, so $\mathrm{something} = 0$ in every matrix and all the $\lambda$'s equal $1$. Each $A_i$ is then the projection onto its own block of coordinates, and for $i \neq j$ these blocks are disjoint, so $A_i A_j = 0$.
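As a small concrete instance of the final block form (my own example, not part of the original answer), take $n = 3$ with

$$ A_1 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}, \qquad A_2 = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}, \qquad A_3 = 0. $$

All three are symmetric, $\operatorname{rk} A_1 + \operatorname{rk} A_2 + \operatorname{rk} A_3 = 2 + 1 + 0 = 3$, and $A_1 + A_2 + A_3 = E$. The nonzero rows of $A_1$ and $A_2$ occupy disjoint positions, all diagonal entries are $1$, and indeed $A_i A_j = 0$ for $i \neq j$, exactly as the argument above concludes.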

For reference, here is the earlier version of the argument, which, as was pointed out in the comments, has a gap: Fix $i$; since $A_{i}$ is symmetric, it is diagonal in some basis, with the nonzero diagonal entries coming first (say there are $k$ of them) and the zero entries afterwards, so everything below the first $k$ rows is $0$. In such a basis every other $A_{j}$ must have its first $k$ rows equal to $0$, since otherwise its image would intersect the image of $A_{i}$. But then $A_{i}A_{j} = 0$ trivially.