Different proof of spectral theorem for self-adjoint operators


Using the Spectral Theorem for quadratic forms, according to which for every quadratic form on a Euclidean vector space there is an orthonormal basis that diagonalizes it, how could one prove that every Hermitian form on a Hermitian vector space is diagonalizable? I already know the standard proof of the spectral theorem, but I am specifically asked to do it this way and don't even know where to start. Thanks in advance for any tips.

You may or may not find this approach helpful. It generalizes to an $n \times n$ Hermitian matrix $A$ over $\mathbb C$ the way you can diagonalize a symmetric matrix by congruence over an arbitrary field of characteristic $\ne 2$. What you do is find a succession of matrices $P_1,\dots,P_m$, each of which has $1$'s all along the leading diagonal and all other entries $0$ except in exactly one row, such that $$(P_1\cdots P_m)^*A(P_1\cdots P_m)$$ is real diagonal, where $*$ is the conjugate transpose operator.

Identify the Hermitian matrix $A$ with the Hermitian form $X^*AX$, where $$X=\begin{bmatrix}X_1\\ \vdots \\X_n\end{bmatrix}.$$ Make a sequence of changes of variables $X=P_1X'$, $X'=P_2X''$, etc. At each step, complete the square in variable $i$ if $(P_1\cdots P_\mu)^*A(P_1\cdots P_\mu)$ has a non-zero diagonal term $\alpha_{ii}$ with one or more non-zero terms in the corresponding row. If the matrix $(P_1\cdots P_\mu)^*A(P_1\cdots P_\mu)$ has one or more non-zero off-diagonal terms $\alpha_{ij} \ne 0$ but every such $\alpha_{ii},\alpha_{jj}=0$, the change of variables $X_j=\bar\alpha_{ij}X_i'+X_j'$, $X_t=X_t'$ for $t \ne j$, forces the non-zero diagonal term $2|\alpha_{ij}|^2$. (Over $\mathbb C$ the simpler substitution $X_j=X_i'+X_j'$ only produces $2\operatorname{Re}\alpha_{ij}$, which vanishes when $\alpha_{ij}$ is purely imaginary, so the conjugate multiplier is needed.) This procedure terminates in no more than $2n$ steps.
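The procedure above can be sketched in NumPy. This is only an illustrative implementation under my own choices: the function name, the tolerance, and the pivot-swap branch (bringing a later non-zero diagonal entry into position before completing the square) are additions not in the text, and the forcing step uses the normalized multiplier $\bar\alpha_{ij}/|\alpha_{ij}|$.

```python
import numpy as np

def congruence_diagonalize(A, tol=1e-12):
    """Find P such that P^* A P is real diagonal, for Hermitian A,
    by repeatedly completing the square (a sketch, not the unique method)."""
    B = np.array(A, dtype=complex)
    n = B.shape[0]
    P = np.eye(n, dtype=complex)

    def congr(E):
        # One congruence step: B <- E^* B E keeps B Hermitian.
        nonlocal B, P
        B = E.conj().T @ B @ E
        P = P @ E

    for i in range(n):
        if abs(B[i, i]) < tol:
            # No pivot at position i: try to swap in a later variable
            # with a non-zero diagonal entry (robustness step, my addition).
            k = next((j for j in range(i + 1, n) if abs(B[j, j]) > tol), None)
            if k is not None:
                E = np.eye(n, dtype=complex)
                E[:, [i, k]] = E[:, [k, i]]
                congr(E)
            else:
                # All remaining diagonal terms vanish; force one via
                # X_j = (conj(alpha_ij)/|alpha_ij|) X_i' + X_j'.
                j = next((j for j in range(i + 1, n) if abs(B[i, j]) > tol), None)
                if j is None:
                    continue  # row i is already all zero
                E = np.eye(n, dtype=complex)
                E[j, i] = np.conj(B[i, j]) / abs(B[i, j])
                congr(E)
        # Complete the square: clear row/column i using the pivot B[i, i].
        E = np.eye(n, dtype=complex)
        for j in range(i + 1, n):
            E[i, j] = -B[i, j] / B[i, i]
        congr(E)
    return P
```

Note that congruence preserves the signature but not the eigenvalues, so this diagonalizes the Hermitian form, which is weaker than the orthonormal diagonalization the spectral theorem asserts.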