Diagonal matrix congruent to a symmetric complex matrix


Given the matrix $$A=\begin{pmatrix}i&1\\1&-i\end{pmatrix}$$ find a matrix $P$ such that $P^T A P$ is diagonal. How should I go about this? We know from Sylvester's theorem that $A$ is congruent to the matrix $D=\begin{pmatrix}1&0\\0&0\end{pmatrix}$, since it has rank $1$ and we are working over the complex field $\mathbb{C}$. However, I'm not sure how to proceed. If $A$ were real, then, since it is symmetric, I could take the identity matrix, perform row/column operations until $A$ is in diagonal form, mimic those operations on the identity matrix, and that would do the job; but in this case it doesn't seem to work (unless I made a calculation mistake...).

Is there a general way to find the matrix which gives the congruence between a symmetric matrix and a diagonal matrix? Thanks in advance.

3 Answers

Accepted Answer

0) Outline of the general approach (may be skipped)
A very flexible way of dealing with problems involving congruence transforms is to view your matrix as representing a symmetric bilinear form,

i.e. abstractly, we have some vector space $V$ whose elements $\mathbf v, \mathbf v'\in V$ are related by

$\langle \mathbf v,\mathbf v'\rangle = \langle \mathbf v',\mathbf v\rangle = c \in \mathbb C$

where $\langle ,\rangle$ denotes some particular symmetric bilinear form, not an inner product per se.

After introducing a basis, write $\mathbf v = P\mathbf x$ and $\mathbf v' = P\mathbf y$; the coordinate interpretation is then
$\langle \mathbf v,\mathbf v'\rangle = \mathbf x^T P^TA P\mathbf y$

OP's problem amounts to selecting (or changing) the basis wisely so that $P^TA P = D$ for some rank-$k$ diagonal matrix $D$ ($k=1$ in this problem), preferably with all non-zero entries on the unit circle.

In general the process is a modified Gram–Schmidt: first figure out the dimension $r$ of the space of null vectors; the subspace $W$ of non-null vectors then has dimension $k=n-r$ (in this particular problem $r=1=k$). A null vector is a vector that is orthogonal to every vector, where orthogonal means $\langle \mathbf v, \mathbf v'\rangle = 0$ (again, this is not an inner product).

Working over $\mathbb C$ (in fact any field of characteristic $\neq 2$) we can easily find vectors that are not self-orthogonal and use these to run Gram–Schmidt. In particular the normalization stage won't fail, since the vectors are not self-orthogonal and in $\mathbb C$ we can always find square roots to normalize the 'length' with respect to the bilinear form to $1$. The computational check at this stage amounts to matrix-vector multiplication: collect $n-r$ linearly independent vectors from $W$ (in coordinate form) as the columns of a matrix $Z$, and for $\mathbf w\in W$ with coordinate vector $\mathbf z$, compute $Z^T A\mathbf z$.
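The normalization step described above can be sketched numerically. This is only an illustration using Python/NumPy (not part of the original answer); `bform` is a hypothetical helper name for the bilinear form $\langle \mathbf x,\mathbf y\rangle = \mathbf x^T A\mathbf y$:

```python
import numpy as np

A = np.array([[1j, 1], [1, -1j]])

def bform(x, y, A=A):
    """Symmetric bilinear form <x, y> = x^T A y (plain transpose, no conjugation)."""
    return x @ A @ y

# A null vector is orthogonal to everything; for symmetric A that is just ker(A).
p_null = np.array([-1, 1j])          # A @ p_null == 0, so r = 1 and k = n - r = 1

# Gram-Schmidt on the non-null part: start from a vector that is NOT
# self-orthogonal, then normalize with a complex square root.
w = np.array([1, 0])                 # <e1, e1> = i != 0
w = w / np.sqrt(bform(w, w))         # complex sqrt always exists over C
print(np.round(bform(w, w), 10))     # ≈ 1 after normalization
```

Note that the square root here is a genuinely complex one ($\sqrt i$), which is exactly the step that has no analogue in the real symmetric case.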

The chapter on bilinear forms in Artin's Algebra has the details of the general approach for symmetric (and Hermitian) and skew-symmetric bilinear forms.

1) Easy answer for OP's specific problem
Whenever $\text{rank}\big(A\big)=1$ for some symmetric $A$, focus on building a basis for the kernel of $A$. For this particular problem:

$\mathbf p_2: = \begin{bmatrix}-1 \\ i \end{bmatrix}$
$A\mathbf p_2 =\mathbf 0$

select $\mathbf p_1$ to be linearly independent of $\mathbf p_2$ (e.g. a standard basis vector will do) and

$P := \bigg[\begin{array}{c|c} \mathbf p_1 & \mathbf p_2 \end{array}\bigg]$

$P^TAP = P^T(AP)= \bigg[\begin{array}{c|c} P^T(A\mathbf p_1) & P^T\mathbf 0 \end{array}\bigg]=\begin{bmatrix}\eta &0 \\ *&0 \end{bmatrix} =\begin{bmatrix}\eta &0 \\ 0&0 \end{bmatrix}$
for some $\eta \neq 0$

1.) We know $*=0$ by symmetry, that is $\big(P^TAP\big)^T = \big(P^TA^TP\big) = \big(P^TAP\big)$.
2.) We also know $\eta \neq 0$ because $\text{rank}\big(P^TAP\big)=\text{rank}\big(A\big)=1$.

From here we may apply one more congruence transform, this time using an elementary scaling (type 3) matrix, so as to map $\eta \mapsto 1$.
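The whole construction can be checked numerically. A sketch in Python/NumPy (an illustration, not part of the original answer), using $\mathbf p_1 = \mathbf e_1$ as the vector independent of the kernel:

```python
import numpy as np

A = np.array([[1j, 1], [1, -1j]])
p1 = np.array([1, 0])            # any vector independent of the kernel
p2 = np.array([-1, 1j])          # kernel vector: A @ p2 == 0
P = np.column_stack([p1, p2])

D1 = P.T @ A @ P                 # plain transpose, NOT the conjugate transpose
print(D1)                        # diag(i, 0), so eta = i here

# one more congruence with the scaling matrix E = diag(1/sqrt(eta), 1)
eta = D1[0, 0]
E = np.diag([1 / np.sqrt(eta), 1])
D2 = E.T @ D1 @ E
print(np.round(D2, 10))          # diag(1, 0)
```

With this choice $\eta = i$, and the final scaling uses the complex square root $\sqrt i$.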

Answer

Yes: take one elementary matrix at a time on the right; if $r$ steps are needed to finish the job, the final matrix is $P = P_1 P_2 \ldots P_r.$ This time, just one is needed: $$ \left( \begin{array}{rr} 1 & 0 \\ i & 1 \\ \end{array} \right) \left( \begin{array}{rr} i & 1 \\ 1 & -i \\ \end{array} \right) \left( \begin{array}{rr} 1 & i \\ 0 & 1 \\ \end{array} \right) = \left( \begin{array}{rr} i & 0 \\ 0 & 0 \\ \end{array} \right) $$ There was no need to use anything other than the Gaussian integers here. For larger matrices, it would still suffice to use a string of elementary matrices with entries in $\mathbb Q [i].$ The algorithmic description of this can be found in the question "reference for linear algebra books that teach reverse Hermite method for symmetric matrices".

If you really want $1$ as the $(1,1)$ entry, you can multiply on both sides by a diagonal matrix whose $(1,1)$ entry is a chosen value of $\sqrt{1/i}.$ So $\frac{1-i}{\sqrt 2}$ works.
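Both steps can be verified numerically. A sketch in Python/NumPy (an illustration added here, not part of the original answer); note the left factor in the displayed product is just $E^T$ for the single elementary matrix $E$:

```python
import numpy as np

A = np.array([[1j, 1], [1, -1j]])
E = np.array([[1, 1j],
              [0, 1]])               # the single elementary matrix, entries in Z[i]
print(E.T @ A @ E)                   # diag(i, 0)

# rescale the (1,1) entry: i -> 1, using sqrt(1/i) = (1 - i)/sqrt(2)
s = (1 - 1j) / np.sqrt(2)
S = np.diag([s, 1])
print(np.round(S.T @ (E.T @ A @ E) @ S, 10))   # diag(1, 0)
```

The check confirms $s^2 \cdot i = 1$, since $s^2 = \left(\frac{1-i}{\sqrt 2}\right)^2 = -i$.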

If we ask about eigenvalues first, we find that the characteristic polynomial and the minimal polynomial are both $\lambda^2,$ so the Jordan form is not diagonal: $$ \left( \begin{array}{rr} 1 & 0 \\ i & 1 \\ \end{array} \right) \left( \begin{array}{rr} i & 1 \\ 1 & -i \\ \end{array} \right) \left( \begin{array}{rr} 1 & 0 \\ -i & 1 \\ \end{array} \right) = \left( \begin{array}{rr} 0 & 1 \\ 0 & 0 \\ \end{array} \right) $$
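This similarity (as opposed to congruence) computation can also be checked numerically; the sketch below in Python/NumPy is an illustration, not part of the original answer. Since $\operatorname{tr} A = i + (-i) = 0$ and $\det A = 0$, the characteristic polynomial $\lambda^2 - (\operatorname{tr} A)\lambda + \det A$ is indeed $\lambda^2$:

```python
import numpy as np

A = np.array([[1j, 1], [1, -1j]])
print(np.trace(A), np.linalg.det(A))   # both ≈ 0, so char. poly. is lambda^2

S = np.array([[1, 0], [1j, 1]])
Sinv = np.array([[1, 0], [-1j, 1]])    # inverse of S
print(S @ A @ Sinv)                    # Jordan block [[0, 1], [0, 0]]
```

In particular $A$ is nilpotent but nonzero, which is why it is similar to the nondiagonal Jordan block even though it is congruent to a diagonal matrix.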

Answer

Consider this: $\mathrm{det}(P^TAP)=0$, since $\mathrm{det}A=0$. Thus, if $P^TAP$ is diagonal, one of its diagonal elements must be zero.

Let $P=\begin{bmatrix}p_1& p_2\\ p_3 &p_4\end{bmatrix}$; then $P^TAP=\begin{bmatrix}2p_1p_3+i(p_1^2-p_3^2)& p_1p_4+p_2p_3+i(p_1p_2-p_3p_4)\\ p_1p_4+p_2p_3+i(p_1p_2-p_3p_4) &2p_2p_4+i(p_2^2-p_4^2)\end{bmatrix}$.

Then, to control the only nonzero diagonal element (let's say $d$), set the off-diagonal entries and $2p_2p_4+i(p_2^2-p_4^2)$ to zero. Note that choosing $p_2=p_4=0$ would make $P$ singular; instead take the second column of $P$ to be the kernel vector of $A$, i.e. $p_2=-1$, $p_4=i$ (which makes the off-diagonal entries vanish automatically), together with $p_1=p_3=\sqrt{\frac{d}{2}}$.
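As a numerical sanity check (a Python/NumPy sketch added here, not part of the original answer; the second column is taken to be the kernel vector $(-1, i)$ so that $P$ stays invertible, and $d$ is an arbitrary target value):

```python
import numpy as np

A = np.array([[1j, 1], [1, -1j]])
d = 5                                  # hypothetical target for the (1,1) entry

# first column: p1 = p3 = sqrt(d/2); second column: the kernel vector of A
P = np.array([[np.sqrt(d / 2), -1],
              [np.sqrt(d / 2), 1j]])
print(np.round(P.T @ A @ P, 10))       # diag(d, 0)
print(np.linalg.det(P))                # nonzero, so P is invertible
```

The $(1,1)$ entry comes out as $2p_1p_3 + i(p_1^2 - p_3^2) = 2\cdot\frac{d}{2} + 0 = d$, as intended.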