A symmetric matrix of rank $r$ is a sum of $r$ symmetric matrices of rank $1$


I proved that a rank $r$ matrix can be decomposed into a sum of $r$ rank $1$ matrices in a unique way up to basis change. So I am trying to prove the same statement for symmetric matrices. Let's consider $n\times n$ matrices. I know that $$ \Big(\sum_{i\in I} e_i\Big)^t\Big(\sum_{i\in I} e_i\Big), $$ where $I\subseteq\{1,\dots,n\}$, gives symmetric matrices of rank $1$; moreover, they span the vector space of symmetric matrices (and those with $|I|\le 2$ already form a basis). So I know that I can indeed write a given symmetric matrix as a sum of rank $1$ symmetric matrices; the problem is to show that I can do it with at most $r$ of them.
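As a quick sanity check of the spanning claim, here is a sketch in Python using exact rational arithmetic (the dimension $n=3$ and all helper names are my own choices for illustration):

```python
from fractions import Fraction
from itertools import combinations

n = 3  # small illustrative dimension

def outer(v):
    # v^t v: outer product of a 0/1 row vector with itself
    return [[Fraction(v[i] * v[j]) for j in range(n)] for i in range(n)]

def rank(rows):
    # matrix rank via Gaussian elimination over the rationals
    rows = [[Fraction(x) for x in r] for r in rows]
    rk, col, m = 0, 0, len(rows[0])
    while rk < len(rows) and col < m:
        piv = next((i for i in range(rk, len(rows)) if rows[i][col]), None)
        if piv is None:
            col += 1
            continue
        rows[rk], rows[piv] = rows[piv], rows[rk]
        for i in range(len(rows)):
            if i != rk and rows[i][col]:
                f = rows[i][col] / rows[rk][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[rk])]
        rk += 1
        col += 1
    return rk

# vectors sum_{i in I} e_i for index sets I of size 1 and 2
vecs = [[1 if k == i else 0 for k in range(n)] for i in range(n)]
vecs += [[1 if k in (i, j) else 0 for k in range(n)]
         for i, j in combinations(range(n), 2)]

mats = [outer(v) for v in vecs]
assert all(rank(M) == 1 for M in mats)  # each v^t v is symmetric of rank 1

# flatten the 6 matrices and check they span Sym_3, which has dim n(n+1)/2 = 6
flat = [[M[i][j] for i in range(n) for j in range(n)] for M in mats]
assert rank(flat) == n * (n + 1) // 2
```

Since there are exactly $n+\binom{n}{2}=\binom{n+1}{2}$ such matrices and they span, they are in fact a basis.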

For the first problem, the idea was that, for a given matrix $A$, we choose a basis in which $A$ has $r$ independent rows and all other rows equal to $0$. I guess the problem for symmetric matrices could be solved by a similar idea of a nice basis change, but I can't see what I want the matrix to look like.

BEST ANSWER

Recall the structure theorem for symmetric bilinear forms on finite-dimensional vector spaces: if $b$ is such a form on $V$, then there is an “orthogonal” (for $b$) decomposition $V=\bigoplus_i{A_i} \oplus \bigoplus_j{B_j}$, where the $A_i$ are lines and the $B_j$ are planes, and each $B_j$ has a basis $(b^1_j,b^2_j)$ such that $b(b^1_j,b^2_j)=1$ and $b(b^1_j,b^1_j)=b(b^2_j,b^2_j)=0$.

This entails that if $S$ is a symmetric matrix, there is an invertible matrix $P$ such that $P^TSP=\Delta$ is block diagonal, with $s$ blocks of size $1 \times 1$ and $b$ blocks $\beta=\begin{bmatrix} 0&1\\1&0\end{bmatrix}$; hence $S$ has rank $a+2b$, where $a\le s$ is the number of nonzero $1\times 1$ blocks. Each nonzero $1\times 1$ block, padded with zeros, is already a symmetric rank $1$ matrix, so decomposing $\Delta$ (and hence $S$, after conjugating back by $P$) reduces to splitting each $\beta$ block into two symmetric rank $1$ matrices.
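The rank count can be checked on a small example. Below is a sketch in Python with exact rational arithmetic; the particular block-diagonal $\Delta$ (two nonzero $1\times 1$ blocks, one zero block, one $\beta$ block) is my own choice for illustration:

```python
from fractions import Fraction

def rank(rows):
    # matrix rank via Gaussian elimination over the rationals
    rows = [[Fraction(x) for x in r] for r in rows]
    rk, col, m = 0, 0, len(rows[0])
    while rk < len(rows) and col < m:
        piv = next((i for i in range(rk, len(rows)) if rows[i][col]), None)
        if piv is None:
            col += 1
            continue
        rows[rk], rows[piv] = rows[piv], rows[rk]
        for i in range(len(rows)):
            if i != rk and rows[i][col]:
                f = rows[i][col] / rows[rk][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[rk])]
        rk += 1
        col += 1
    return rk

# Delta: 1x1 blocks 2, 0, 5 (so a = 2 nonzero ones) and b = 1 beta block
delta = [
    [2, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 5, 0, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 1, 0],
]
assert rank(delta) == 2 + 2 * 1  # a + 2b, as claimed
```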

So it is enough to prove the theorem for $\beta$. Well, $\beta=\begin{bmatrix}1 & 1/2 \\ 1/2 & 1/4\end{bmatrix}+\begin{bmatrix} -1 & 1/2 \\ 1/2 & -1/4\end{bmatrix},$ and each summand is symmetric of rank $1$ (nonzero, with vanishing determinant), so $\beta$, which has rank $2$, is a sum of $2$ symmetric rank $1$ matrices, as required.
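This final identity is easy to verify directly; a minimal check in Python with exact fractions (variable names are my own):

```python
from fractions import Fraction as F

beta = [[F(0), F(1)], [F(1), F(0)]]
A = [[F(1), F(1, 2)], [F(1, 2), F(1, 4)]]
B = [[F(-1), F(1, 2)], [F(1, 2), F(-1, 4)]]

def det2(M):
    # determinant of a 2x2 matrix
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

# each summand is symmetric, nonzero, with zero determinant: rank 1
for M in (A, B):
    assert M[0][1] == M[1][0] and det2(M) == 0 and any(any(r) for r in M)

# and they sum to beta
assert all(A[i][j] + B[i][j] == beta[i][j] for i in range(2) for j in range(2))
```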