Proving that two matrices are congruent, in order to establish a rank and signature result.


I'm reading Further Linear Algebra by T. S. Blyth and E. F. Robertson. I'm somewhat stuck on Problem 9.14. In particular, they want us to establish that the quadratic form $$ Q(x_1,\ldots,x_n) = \sum_{r < s} (x_r - x_s)^2 $$ has both rank and signature equal to $n-1$. The hint they give is to show that the matrix for $Q$ with respect to the standard basis has $n-1$ on the diagonal and $-1$ elsewhere. Then they ask us to show that this matrix is congruent to the diagonal matrix with entries $n-1, n-2, \ldots, 2, 1, 0$. I'm assuming there is some clever basis transformation that gives us $P^t A P = \text{diag}(n-1, n-2, \ldots, 2, 1, 0)$, but I can't figure it out. Any help?
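For concreteness, the hint's matrix can be written down and sanity-checked numerically. A minimal NumPy sketch (the variable names are mine, not the book's): expanding the squares gives $Q(\mathbf x) = (n-1)\sum_i x_i^2 - 2\sum_{r<s} x_r x_s$, so the matrix is $A = nI - J$ with $J$ the all-ones matrix.

```python
import numpy as np

# Matrix of Q in the standard basis, per the hint:
# n-1 on the diagonal, -1 elsewhere, i.e. A = n*I - J.
n = 5
A = n * np.eye(n) - np.ones((n, n))

# Sanity check: x^T A x agrees with sum_{r<s} (x_r - x_s)^2
# for a random vector x.
rng = np.random.default_rng(0)
x = rng.standard_normal(n)
q_direct = sum((x[r] - x[s]) ** 2 for r in range(n) for s in range(r + 1, n))
assert np.isclose(x @ A @ x, q_direct)
```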


There are 4 answers below.


I don't see the exact pattern they want, but by Sylvester's Law of Inertia any congruent diagonal form determines the rank and signature.

You can get all integer entries (if it is alright to use $P$ with determinant larger than $1$) by multiplying on both the right and the left by the diagonal matrix $\operatorname{diag}(1, n-1, n-2, \ldots, 3, 2, 1)$:

$$ P^T H P = D $$ $$\left( \begin{array}{rrrr} 1 & 0 & 0 & 0 \\ \frac{ 1 }{ 3 } & 1 & 0 & 0 \\ \frac{ 1 }{ 2 } & \frac{ 1 }{ 2 } & 1 & 0 \\ 1 & 1 & 1 & 1 \\ \end{array} \right) \left( \begin{array}{rrrr} 3 & - 1 & - 1 & - 1 \\ - 1 & 3 & - 1 & - 1 \\ - 1 & - 1 & 3 & - 1 \\ - 1 & - 1 & - 1 & 3 \\ \end{array} \right) \left( \begin{array}{rrrr} 1 & \frac{ 1 }{ 3 } & \frac{ 1 }{ 2 } & 1 \\ 0 & 1 & \frac{ 1 }{ 2 } & 1 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 1 \\ \end{array} \right) = \left( \begin{array}{rrrr} 3 & 0 & 0 & 0 \\ 0 & \frac{ 8 }{ 3 } & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 0 \\ \end{array} \right) $$

$$ P^T H P = D $$ $$\left( \begin{array}{rrrrr} 1 & 0 & 0 & 0 & 0 \\ \frac{ 1 }{ 4 } & 1 & 0 & 0 & 0 \\ \frac{ 1 }{ 3 } & \frac{ 1 }{ 3 } & 1 & 0 & 0 \\ \frac{ 1 }{ 2 } & \frac{ 1 }{ 2 } & \frac{ 1 }{ 2 } & 1 & 0 \\ 1 & 1 & 1 & 1 & 1 \\ \end{array} \right) \left( \begin{array}{rrrrr} 4 & - 1 & - 1 & - 1 & - 1 \\ - 1 & 4 & - 1 & - 1 & - 1 \\ - 1 & - 1 & 4 & - 1 & - 1 \\ - 1 & - 1 & - 1 & 4 & - 1 \\ - 1 & - 1 & - 1 & - 1 & 4 \\ \end{array} \right) \left( \begin{array}{rrrrr} 1 & \frac{ 1 }{ 4 } & \frac{ 1 }{ 3 } & \frac{ 1 }{ 2 } & 1 \\ 0 & 1 & \frac{ 1 }{ 3 } & \frac{ 1 }{ 2 } & 1 \\ 0 & 0 & 1 & \frac{ 1 }{ 2 } & 1 \\ 0 & 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 0 & 1 \\ \end{array} \right) = \left( \begin{array}{rrrrr} 4 & 0 & 0 & 0 & 0 \\ 0 & \frac{ 15 }{ 4 } & 0 & 0 & 0 \\ 0 & 0 & \frac{ 10 }{ 3 } & 0 & 0 \\ 0 & 0 & 0 & \frac{ 5 }{ 2 } & 0 \\ 0 & 0 & 0 & 0 & 0 \\ \end{array} \right) $$

$$ P^T H P = D $$ $$\left( \begin{array}{rrrrrr} 1 & 0 & 0 & 0 & 0 & 0 \\ \frac{ 1 }{ 5 } & 1 & 0 & 0 & 0 & 0 \\ \frac{ 1 }{ 4 } & \frac{ 1 }{ 4 } & 1 & 0 & 0 & 0 \\ \frac{ 1 }{ 3 } & \frac{ 1 }{ 3 } & \frac{ 1 }{ 3 } & 1 & 0 & 0 \\ \frac{ 1 }{ 2 } & \frac{ 1 }{ 2 } & \frac{ 1 }{ 2 } & \frac{ 1 }{ 2 } & 1 & 0 \\ 1 & 1 & 1 & 1 & 1 & 1 \\ \end{array} \right) \left( \begin{array}{rrrrrr} 5 & - 1 & - 1 & - 1 & - 1 & - 1 \\ - 1 & 5 & - 1 & - 1 & - 1 & - 1 \\ - 1 & - 1 & 5 & - 1 & - 1 & - 1 \\ - 1 & - 1 & - 1 & 5 & - 1 & - 1 \\ - 1 & - 1 & - 1 & - 1 & 5 & - 1 \\ - 1 & - 1 & - 1 & - 1 & - 1 & 5 \\ \end{array} \right) \left( \begin{array}{rrrrrr} 1 & \frac{ 1 }{ 5 } & \frac{ 1 }{ 4 } & \frac{ 1 }{ 3 } & \frac{ 1 }{ 2 } & 1 \\ 0 & 1 & \frac{ 1 }{ 4 } & \frac{ 1 }{ 3 } & \frac{ 1 }{ 2 } & 1 \\ 0 & 0 & 1 & \frac{ 1 }{ 3 } & \frac{ 1 }{ 2 } & 1 \\ 0 & 0 & 0 & 1 & \frac{ 1 }{ 2 } & 1 \\ 0 & 0 & 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 0 & 0 & 1 \\ \end{array} \right) = \left( \begin{array}{rrrrrr} 5 & 0 & 0 & 0 & 0 & 0 \\ 0 & \frac{ 24 }{ 5 } & 0 & 0 & 0 & 0 \\ 0 & 0 & \frac{ 9 }{ 2 } & 0 & 0 & 0 \\ 0 & 0 & 0 & 4 & 0 & 0 \\ 0 & 0 & 0 & 0 & 3 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 \\ \end{array} \right) $$
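The pattern in the three worked examples ($n = 4, 5, 6$) can be checked for general $n$ with a short NumPy sketch. The closed form $n(n-k)/(n-k+1)$ for the $k$-th diagonal entry of $D$ is my reading of the examples, not something stated in the book:

```python
import numpy as np

def congruence_check(n):
    # H has n-1 on the diagonal and -1 elsewhere, i.e. H = n*I - J.
    H = n * np.eye(n) - np.ones((n, n))
    # Upper-triangular P matching the examples above: column j
    # (0-indexed) has 1/(n - j) in every position above the diagonal.
    P = np.eye(n)
    for j in range(1, n):
        P[:j, j] = 1.0 / (n - j)
    return P.T @ H @ P

n = 5
D = congruence_check(n)
# Reading off the examples: k-th diagonal entry is n(n-k)/(n-k+1),
# e.g. diag(4, 15/4, 10/3, 5/2, 0) for n = 5.
expected = [n * (n - k) / (n - k + 1) for k in range(1, n + 1)]
assert np.allclose(D, np.diag(expected))
```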


This is not an answer, just a suggestion on how to do it more easily.

The all-ones matrix $J$ has eigenvalue $n$ once and eigenvalue $0$ with multiplicity $n-1$.

Hence $nI-J$ has eigenvalues $0$ once and $n$ with multiplicity $(n-1)$.

So the rank and signature are both $n-1$.
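A quick numerical confirmation of this eigenvalue argument (my own sketch, not part of the original answer):

```python
import numpy as np

n = 7
J = np.ones((n, n))
# Eigenvalues of n*I - J, sorted ascending.
eigs = np.sort(np.linalg.eigvalsh(n * np.eye(n) - J))
# Smallest eigenvalue is 0 (eigenvector: the all-ones vector);
# the remaining n-1 eigenvalues all equal n, so rank = signature = n-1.
assert np.isclose(eigs[0], 0)
assert np.allclose(eigs[1:], n)
```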


The simplest approach is to note that for any $\mathbf x \in \mathbb R^n$
$Q\big(\mathbf x\big) = Q\big(x_1,\ldots,x_n\big) = \sum_{r < s} (x_r - x_s)^2\geq 0$
so this is a positive semi-definite form.

If we select $\mathbf x:= \mathbf 1$, the all-ones vector, we get
$Q\big(\mathbf 1\big)= \sum_{r < s} (0)^2=0$
so $\mathbf 1$ lies in the null space of the form (for a PSD form, $Q(\mathbf v)=0$ forces $A\mathbf v = \mathbf 0$).


Alternative justification: explicitly writing out the bilinear form,
$\langle \mathbf x,\mathbf 1\rangle $
$= \frac{1}{2}\Big(Q\big(\mathbf x + \mathbf 1\big) - Q\big(\mathbf 1\big) - Q\big(\mathbf x\big)\Big) $
$= \frac{1}{2}\Big(Q\big(\mathbf x + \mathbf 1\big) - 0 - Q\big(\mathbf x\big)\Big) $
$=\frac{1}{2}\Big\{\Big(\sum_{r < s} \big((x_r+1) - (x_s+1)\big)^2\Big)-\Big(\sum_{r < s} \big(x_r - x_s\big)^2\Big)\Big\}$
$=\frac{1}{2}\Big\{\Big(\sum_{r < s} \big(x_r - x_s\big)^2\Big)-\Big(\sum_{r < s} \big(x_r - x_s\big)^2\Big)\Big\}$
$=0$
where the choice of $\mathbf x$ is arbitrary, so $\mathbf 1$ is a null vector for the bilinear form.
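In matrix terms the same fact is a one-line check: with $A = nI - J$ the matrix of $Q$, every row sums to $(n-1) - (n-1) = 0$, so $A\mathbf 1 = \mathbf 0$. A minimal NumPy confirmation (my own sketch):

```python
import numpy as np

n = 5
# Matrix of Q: n-1 on the diagonal, -1 elsewhere.
A = n * np.eye(n) - np.ones((n, n))
# Each row sums to zero, so the all-ones vector is annihilated by A,
# i.e. it lies in the radical of the bilinear form.
assert np.allclose(A @ np.ones(n), 0)
```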


Now for any $\mathbf x$ that is not proportional to $\mathbf 1$, i.e. for any $\mathbf x \not\propto \mathbf 1$, we have
$Q\big(\mathbf x\big) = \sum_{r < s} (x_r - x_s)^2\gt 0$
since at least one pair of coordinates satisfies $x_r \neq x_s$.

This proves the form's signature is $\big(n-1,0\big)$: $n-1$ positive squares and no negative ones. Sylvester's Law of Inertia tells you this holds irrespective of which congruent diagonal form you compute.


Thanks all! By brute-force algebra, I found that to turn $\sum_{i=1}^n (n-1)x_i^2 - 2 \sum_{1 \leq i < j \leq n}x_ix_j$ into $(n-1)y_1^2+(n-2)y_2^2+\cdots + 1 \cdot y_{n-1}^2 + 0\cdot y_n^2$, the substitution $y_k = \sqrt{\frac{n}{n-k+1}}\left(x_k - \frac{x_{k+1}+\cdots+x_n}{n-k}\right)$ for $k = 1, \ldots, n-1$ and $y_n = x_n$ does the trick. I still don't know what the pedagogical purpose of this exercise was.
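The substitution above can be verified numerically. This sketch (my own code; it uses 0-indexed arrays internally while the formula is 1-indexed) checks that $Q(\mathbf x) = \sum_{k=1}^{n-1}(n-k)\,y_k^2$ for a random input:

```python
import numpy as np

def Q(x):
    n = len(x)
    return sum((x[r] - x[s]) ** 2 for r in range(n) for s in range(r + 1, n))

def substitution(x):
    # y_k = sqrt(n/(n-k+1)) * (x_k - mean of x_{k+1},...,x_n)
    # for 1-indexed k = 1,...,n-1, and y_n = x_n.
    n = len(x)
    y = np.empty(n)
    for k in range(1, n):
        tail_mean = x[k:].mean()  # (x_{k+1} + ... + x_n) / (n - k)
        y[k - 1] = np.sqrt(n / (n - k + 1)) * (x[k - 1] - tail_mean)
    y[n - 1] = x[n - 1]
    return y

rng = np.random.default_rng(1)
n = 6
x = rng.standard_normal(n)
y = substitution(x)
# Target diagonal form: (n-1)y_1^2 + (n-2)y_2^2 + ... + 1*y_{n-1}^2.
target = sum((n - k) * y[k - 1] ** 2 for k in range(1, n))
assert np.isclose(Q(x), target)
```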