Looking for an alternative and easier way to prove Sylvester's theorem


My professor gave two statements of Sylvester's theorem; the first is the following:

Two symmetric n × n matrices B and C are congruent if and only if the diagonal representations for B and C have the same rank, index and signature.
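As a numerical illustration of this first statement (my own sketch, not from the lecture): a congruence $C = P^T B P$ with $P$ invertible preserves the counts of positive, negative and zero eigenvalues, and hence the rank, index and signature.

```python
import numpy as np

def inertia(B, tol=1e-9):
    """Counts (p, q, z) of positive, negative and zero eigenvalues of the
    symmetric matrix B; rank = p + q, and index/signature follow from p, q."""
    eig = np.linalg.eigvalsh(B)
    p = int(np.sum(eig > tol))
    q = int(np.sum(eig < -tol))
    return p, q, len(eig) - p - q

rng = np.random.default_rng(0)
B = np.diag([1.0, 1.0, -1.0, 0.0])   # inertia (2, 1, 1)
P = rng.normal(size=(4, 4))          # a random matrix is generically invertible
C = P.T @ B @ P                      # congruent to B
print(inertia(B), inertia(C))        # the same triple for both
```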

He didn't prove this statement, but he did prove the following version of Sylvester's theorem:

Given a scalar product $\langle \cdot, \cdot \rangle$ on the real vector space $V$, there exists a basis $\{v_1,\dots,v_n\}$ such that the matrix of the scalar product with respect to this basis is

$$\begin{pmatrix} I_p & 0 & 0 \\ 0 & -I_q & 0 \\ 0 & 0 & 0 \end{pmatrix}$$

However, the proof is a bit overwhelming, with many steps, and I often get stuck trying to reproduce it on my own. Does anyone know a more comprehensible way to prove this second statement? I tried looking online, but this version is apparently hard to find in books.

Accepted answer:

I don't know if this counts as an alternative proof; it may just be the standard one:

If your scalar product is identically zero, then you are done: in this case $p=q=0$ and the nullity of the scalar product is $n=\dim V$.

If it is not identically zero, then there exists a vector $v\neq 0$ such that $\langle v,v\rangle \neq 0$ (if $\langle v,v\rangle = 0$ for every $v$, the polarization identity $\langle u,w\rangle = \tfrac{1}{2}\big(\langle u+w,u+w\rangle - \langle u,u\rangle - \langle w,w\rangle\big)$ would force the whole form to vanish). Replacing $v$ by $v/\sqrt{|\langle v,v \rangle|}$, we may assume for simplicity that $\langle v, v\rangle=\pm 1$.

Let's prove the statement by induction on the dimension of $V$. If the space is one-dimensional, the vector $v$ alone is the basis we are looking for.

Suppose the statement is true for all dimensions $k<\dim V$; we show it also holds for $\dim V$. Consider

$$v^\perp=\{w \in V : \langle v, w\rangle =0 \}.$$

This is a proper subspace of $V$ (since $v\notin v^\perp$, because $\langle v,v\rangle\neq 0$) of dimension $k:=\dim V -1$, so by the inductive hypothesis $\langle \cdot,\cdot \rangle|_{v^\perp}$ admits a basis $w_1, \dots, w_k\in v^\perp$ in which its associated matrix is

$$\begin{pmatrix} I_{p} & 0 & 0 \\ 0 & -I_{q} & 0 \\ 0 & 0& 0 \end{pmatrix}$$

Then $\{w_1,\dots, w_{p+q}, v, w_{p+q+1},\dots, w_k\}$, i.e. the inductive basis with $v$ inserted after the $\pm I$ blocks, is a basis of $V$ such that the associated matrix of $\langle \cdot,\cdot \rangle$ in that basis is

$$\begin{pmatrix} I_{p} & 0 & 0&0 \\ 0 & -I_{q} & 0 &0\\ 0 & 0& \pm 1 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}$$

Finally, move $v$ next to the block of matching sign: if $\langle v,v\rangle = 1$ the matrix becomes $\operatorname{diag}(I_{p+1}, -I_q, 0)$, and if $\langle v,v\rangle = -1$ it becomes $\operatorname{diag}(I_p, -I_{q+1}, 0)$, which is the desired form.
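The inductive argument above is effectively an algorithm, and it can be sketched in numpy (function name, example matrix and numerical details are my own; the form is represented by its Gram matrix $B$ in the standard basis):

```python
import numpy as np

def diagonalize_form(B, tol=1e-9):
    """Inductive construction from the proof: returns U whose columns form a
    basis in which the form x^T B y has matrix diag(I_p, -I_q, 0)."""
    n = B.shape[0]
    U = []                                # basis vectors found so far
    W = np.eye(n)                         # columns span the remaining subspace
    while W.shape[1] > 0:
        G = W.T @ B @ W                   # form restricted to span(W)
        v = None
        for i in range(W.shape[1]):
            if abs(G[i, i]) > tol:        # a vector with <v, v> != 0
                v = W[:, i]
                break
        if v is None:
            idx = np.argwhere(np.abs(G) > tol)
            if idx.size == 0:             # form vanishes on span(W):
                U.extend(W.T)             # any basis of this radical works
                break
            i, j = idx[0]                 # <w_i, w_j> != 0 while diagonals vanish,
            v = W[:, i] + W[:, j]         # so <v, v> = 2 <w_i, w_j> != 0
        v = v / np.sqrt(abs(v @ B @ v))   # normalize so that <v, v> = +-1
        U.append(v)
        # replace each w by w - (<w, v>/<v, v>) v, i.e. project onto v^perp
        W = W - np.outer(v, (W.T @ B @ v) / (v @ B @ v))
        u_svd, s, _ = np.linalg.svd(W, full_matrices=False)
        W = u_svd[:, s > tol]             # drop the direction that became dependent
    U = np.array(U).T
    d = np.diag(U.T @ B @ U)
    key = np.where(d > 0.5, 0, np.where(d < -0.5, 1, 2))  # order: +1s, -1s, 0s
    return U[:, np.argsort(key, kind="stable")]

B = np.array([[2.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])
U = diagonalize_form(B)
print(np.round(U.T @ B @ U, 6))           # diag(1, -1, 0)
```

The SVD step is just a numerically convenient way to pick a basis of $v^\perp$ inside the current subspace; in the proof this corresponds to passing from $V$ to the $(k{-}1)$-dimensional space $v^\perp$.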

If you have any questions, let me know and we can discuss. :)

Remark: The dimension of $v^\perp$ is $\dim V -1$ because $V=v^\perp \oplus \langle v\rangle$, where $\langle v\rangle$ denotes the span of $v$ (the sum is direct since $\langle v,v\rangle\neq 0$ forces $v^\perp\cap\langle v\rangle=\{0\}$). Indeed, for any $w\in V$,

$$w=\Big(w-\frac{\langle w,v\rangle}{\langle v,v \rangle} v\Big)+ \frac{\langle w,v\rangle}{\langle v,v \rangle} v,$$ where $\frac{\langle w,v\rangle}{\langle v,v \rangle}v\in \langle v\rangle$ and $w-\frac{\langle w,v\rangle}{\langle v,v \rangle} v\in v^\perp$.
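This decomposition is easy to check numerically; a small sketch (the matrix $B$ and the vectors $v$, $w$ are my own example, with an indefinite form $\langle x,y\rangle = x^T B y$):

```python
import numpy as np

B = np.diag([1.0, -1.0, 1.0])             # indefinite symmetric form
v = np.array([1.0, 0.0, 0.0])             # <v, v> = 1 != 0
w = np.array([3.0, 2.0, -1.0])            # an arbitrary vector

c = (w @ B @ v) / (v @ B @ v)             # <w, v> / <v, v>
w_perp = w - c * v                        # the v^perp component of w
print(abs(w_perp @ B @ v) < 1e-12)        # True: w_perp lies in v^perp
print(np.allclose(w_perp + c * v, w))     # True: the two pieces sum to w
```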