Diagonalizing a bilinear form


Question

a.

We denote by $\mathbb{R}_2[x]$ the space of polynomials of degree $\le 2$ over the real field $\mathbb{R}$, and define $\xi :\mathbb{R}_2[x] \times \mathbb{R}_2[x] \to \mathbb{R}$ by $$\xi(q,p) = q(-1)p(-1)+q(0)p(0)+q(1)p(1)$$

Find a basis in which the matrix of $\xi$ is diagonal.
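As a reference point (not part of the original question), the matrix of $\xi$ in the standard basis $\{1, x, x^2\}$ can be computed directly; here is a small NumPy sanity check:

```python
import numpy as np

# Standard basis {1, x, x^2} of R_2[x], as numpy polynomials
basis = [np.poly1d([1]), np.poly1d([1, 0]), np.poly1d([1, 0, 0])]
nodes = [-1, 0, 1]

def xi(q, p):
    # xi(q, p) = q(-1)p(-1) + q(0)p(0) + q(1)p(1)
    return sum(q(t) * p(t) for t in nodes)

# Gram matrix of xi in the basis {1, x, x^2}
G = np.array([[xi(q, p) for p in basis] for q in basis])
print(G)  # [[3 0 2], [0 2 0], [2 0 2]] -- not diagonal in this basis
```

One pleasant observation: the Lagrange interpolation basis $\ell_{-1}, \ell_0, \ell_1$ at the nodes $-1, 0, 1$ (with $\ell_i(j) = \delta_{ij}$) makes the matrix of $\xi$ the identity, which, if I'm not mistaken, answers part (a) directly.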

b.

$\xi_A :\mathbb{R}_3[x] \times \mathbb{R}_3[x] \to \mathbb{R}$ is the bilinear form represented by the matrix $$ A = \left( \begin{array}{ccc} 1 & -2 & 3 \\ -2 & 6 & -9 \\ 3 & -9 & 4 \end{array} \right) $$

Find a basis in which the matrix of $\xi_A$ is diagonal.

Methods

We were taught a method for building an orthogonal basis, but it is *not* Gram–Schmidt. Do you know it? I can barely find examples that use this method. It goes something like this:

Find a vector $v$ such that $\xi(v,v) \neq 0$. Find the subspace that is $\xi$-orthogonal to $\operatorname{span}\{v\}$; we denote it $L^\perp$. Now we have a direct sum of two subspaces.

Take $L^\perp$ and repeat the process:

Find a vector $u \in L^\perp$ such that $\xi(u,u) \neq 0$. We can be sure such a vector exists, for the following reason:

Assume $\xi \ne 0$ on the subspace (if not, we are done with the whole method anyway). Then there exist vectors $x, y$ such that $\xi(x,y) \neq 0$.

If $\xi(x,x) \neq 0$, take $u=x$; if $\xi(y,y) \neq 0$, take $u=y$. Otherwise, $\xi(x+y,x+y) = \xi(x,x)+ \xi(x,y) + \xi(y,x)+\xi(y,y)$, and since $\xi(x,x) = 0$ and $\xi(y,y) = 0$, symmetry gives $\xi(x+y,x+y) = 2\xi(x,y) \neq 0$.

In that case we take $u=x+y$.
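For what it's worth, here is a minimal numerical sketch of the method just described (my own NumPy rendering, run on the matrix $A$ from part b; the SVD is just one convenient way to extract a basis of the orthogonal complement and is not part of the method itself):

```python
import numpy as np

A = np.array([[1., -2., 3.],
              [-2., 6., -9.],
              [3., -9., 4.]])  # the matrix from part (b)

def xi(x, y):
    return x @ A @ y

def nonisotropic(vecs, tol=1e-10):
    """Find v in the span of vecs with xi(v, v) != 0, as argued above."""
    for v in vecs:
        if abs(xi(v, v)) > tol:
            return v
    for i, x in enumerate(vecs):
        for y in vecs[i + 1:]:
            if abs(xi(x, y)) > tol:
                return x + y  # xi(x+y, x+y) = 2*xi(x, y) != 0 by symmetry
    return None  # the form vanishes on this subspace

def xi_orthogonal_basis():
    vecs = list(np.eye(len(A)))  # spanning set of the current subspace
    result = []
    while vecs:
        v = nonisotropic(vecs)
        if v is None:  # form is zero here, so any remaining basis works
            result.extend(vecs)
            break
        result.append(v)
        # project the spanning set into the xi-orthogonal complement of v
        proj = np.array([w - (xi(v, w) / xi(v, v)) * v for w in vecs])
        # extract a basis of the (one dimension smaller) complement
        _, s, vt = np.linalg.svd(proj, full_matrices=False)
        vecs = list(vt[s > 1e-10])
    return result

basis = xi_orthogonal_basis()
M = np.array(basis)
print(M @ A @ M.T)  # diagonal (up to rounding)
```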

Do you know anything about this method?

Please, any help would be appreciated!


2 Answers


Here is an algorithm, using an augmented matrix: \begin{eqnarray} &&\begin{bmatrix} 1 & -2 & 3 & : & 1 &0 &0\\ -2 & 6 & -9&:&0&1&0 \\ 3 & -9 & 4&:&0&0&1 \end{bmatrix}\\ &2C_1+C_2&\begin{bmatrix} 1 & 0 & 3 & : & 1 &0 &0\\ -2 & 2 & -9&:&0&1&0 \\ 3 & -3 & 4&:&0&0&1 \end{bmatrix}\\ &2R_1+R_2&\begin{bmatrix} 1 & 0 & 3 & : & 1 &0 &0\\ 0 & 2 & -3&:&2&1&0 \\ 3 & -3 & 4&:&0&0&1 \end{bmatrix}\\ &-3C_1+C_3&\begin{bmatrix} 1 & 0 & 0 & : & 1 &0 &0\\ 0 & 2 & -3&:&2&1&0 \\ 3 & -3 & -5&:&0&0&1 \end{bmatrix}\\ &-3R_1+R_3&\begin{bmatrix} 1 & 0 & 0 & : & 1 &0 &0\\ 0 & 2 & -3&:&2&1&0 \\ 0 & -3 & -5&:&-3&0&1 \end{bmatrix}\\ &3/2C_2+C_3&\begin{bmatrix} 1 & 0 & 0 & : & 1 &0 &0\\ 0 & 2 & 0&:&2&1&0 \\ 0 & -3 & -19/2&:&-3&0&1 \end{bmatrix}\\ &3/2R_2+R_3&\begin{bmatrix} 1 & 0 & 0 & : & 1 &0 &0\\ 0 & 2 & 0&:&2&1&0 \\ 0 & 0 & -19/2&:&0&3/2&1 \end{bmatrix}\\ \end{eqnarray} So the matrix on the left is a diagonal matrix $D$, and the matrix on the right is a matrix $Q^T$ such that $Q^TAQ=D$. As you can see, with this method you simply perform an elementary column operation followed by the exact same row operation, which keeps the matrix symmetric at every step.
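A quick numeric check of this computation (NumPy, my own addition, using the $Q^T$ read off above):

```python
import numpy as np

A = np.array([[1, -2, 3],
              [-2, 6, -9],
              [3, -9, 4]], dtype=float)

# Q^T is the right-hand block of the final augmented matrix
QT = np.array([[1, 0, 0],
               [2, 1, 0],
               [0, 1.5, 1]])

D = QT @ A @ QT.T
print(D)  # diag(1, 2, -19/2)
```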

Just to note that this is specifically a method to diagonalize a symmetric bilinear form, so we are talking about the congruence relation here. $Q$ does represent a change-of-basis matrix, but $QQ^T \neq I$ in general, so don't confuse this with diagonalization of matrices under the similarity relation.
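To illustrate that distinction concretely (a sketch of my own): congruence changes the eigenvalues, but by Sylvester's law of inertia it preserves the signature.

```python
import numpy as np

A = np.array([[1, -2, 3],
              [-2, 6, -9],
              [3, -9, 4]], dtype=float)
QT = np.array([[1, 0, 0],
               [2, 1, 0],
               [0, 1.5, 1]])
Q = QT.T

# Q is a change-of-basis matrix, but not an orthogonal one:
print(Q @ Q.T)  # not the identity

# hence D = Q^T A Q is congruent, NOT similar, to A; the eigenvalues
# of A are not 1, 2, -19/2 ...
print(np.linalg.eigvalsh(A))
# ... but the signature matches: A has exactly one negative eigenvalue,
# just as D = diag(1, 2, -19/2) has exactly one negative entry.
```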


Not an answer, just an attempt to understand the question, but too long for a comment.

"Find a vector $v$ such that $\xi(v,v)\ne0$." OK, let $v=(1,0,0)$, then $$\xi(v,v)=v^tAv=1\ne0$$

"Find the subspace orthogonal to the span of $\{v\}$." OK, that's all the vectors of the form $(0,r,s)$. "We mark it as $L^{\perp}$." OK, $$L^{\perp}=\{\,(0,r,s):r,s\in{\bf R}\,\}$$

"Find a vector $u$ in $L^{\perp}$ such that $\xi(u,u)\ne0$." OK, $u=(0,1,0)$ is in $L^{\perp}$, and $\xi(u,u)=6\ne0$.

Now presumably one repeats the process and gets a vector like $w=(0,0,1)$ which is orthogonal to both $u$ and $v$ and which satisfies $\xi(w,w)\ne0$ (in fact, $\xi(w,w)=4$).

So, we have our orthogonal basis $\{\,u,v,w\,\}$, but I don't see what this has to do with making $A$ diagonal. So, where along the way have I done something that wasn't what you intended?
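A quick numeric check of the values above (my own, in NumPy). It confirms the diagonal values $1, 6, 4$, and also shows that $\{u,v,w\}$, while orthogonal for the standard dot product, is not orthogonal with respect to $\xi_A$, e.g. $\xi_A(v,u) = -2 \neq 0$:

```python
import numpy as np

A = np.array([[1, -2, 3],
              [-2, 6, -9],
              [3, -9, 4]], dtype=float)
v, u, w = np.eye(3)  # v=(1,0,0), u=(0,1,0), w=(0,0,1) as in the text

# the diagonal values computed above
print(v @ A @ v, u @ A @ u, w @ A @ w)  # 1.0 6.0 4.0

# but the off-diagonal values do not vanish, so this basis does not
# make the matrix of xi_A diagonal
print(v @ A @ u, v @ A @ w, u @ A @ w)  # -2.0 3.0 -9.0
```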