transformation matrix between two sets of orthogonal basis vectors


Assume that $V$ is a real vector space with a scalar product; by the latter I mean a symmetric bilinear form $g$ such that the map $h: V \to V^*$ given by $$h(v):=g(v, \cdot)$$ is an isomorphism of $V$ with its dual.

Now there is the notion of an orthonormal basis, consisting of vectors $u_i$ $(i=1,\dots,n)$ such that $$g(u_i,u_j)=0$$ for all $i \neq j$ and $$g(u_i,u_i)=\pm 1.$$

Now my question: In the case of an ordinary inner product (that is, a positive definite symmetric bilinear form) there is the general result that the matrix taking one orthonormal basis to another is orthogonal: its columns are unit-length and mutually orthogonal. Does a similar result hold here? If not, is there any way to deduce the weaker statement that the determinant of such a linear map must equal $\pm 1$?
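To see why the naive analogue fails, here is a small numerical sketch (my own illustration, not from the question): for the Minkowski form $g(u,v) = u^\top B v$ on $\mathbb{R}^2$ with $B = \operatorname{diag}(1,-1)$, a Lorentz boost maps one pseudo-orthonormal basis to another, yet its columns are not unit vectors in the Euclidean sense. Its determinant is still $\pm 1$.

```python
import numpy as np

# Illustration: the Minkowski form on R^2, g(u, v) = u^T B v, signature (+, -).
B = np.diag([1.0, -1.0])

# A Lorentz boost with rapidity phi maps one pseudo-orthonormal basis
# to another, yet its columns are not Euclidean unit vectors.
phi = 0.5
S = np.array([[np.cosh(phi), np.sinh(phi)],
              [np.sinh(phi), np.cosh(phi)]])

print(np.allclose(S.T @ B @ S, B))           # S preserves the form g
print(np.allclose(S.T @ S, np.eye(2)))       # but S is NOT Euclidean-orthogonal
print(np.isclose(abs(np.linalg.det(S)), 1))  # still, |det S| = 1
```

So in the indefinite case "orthogonal matrix" is replaced by "pseudo-orthogonal matrix" ($S^\top B S = B$), which is exactly what the answer below establishes.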




Yes, the determinant needs to equal $\pm 1$.

Assuming the dimension is finite, a nondegenerate symmetric bilinear form can be written in coordinates as $$(u, v) \mapsto u^\top B v,$$

where $u, v$ are column vectors and $B$ is a nonsingular symmetric matrix. Now if $x_1, \dots, x_n$ is a pseudo-orthonormal basis, put the $x_i$ as the columns of a matrix $X$, which must then satisfy $$ X^\top B X = \operatorname{diag}(I_p, -I_q),$$ where $\operatorname{diag}$ means block diagonal, $I_p, I_q$ are identity matrices of the indicated sizes, and $p+q = \dim V$ (by Sylvester's law of inertia, $p$ and $q$ depend only on $g$, not on the chosen basis).
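One way to produce such an $X$ numerically (a sketch of my own, not part of the answer) is via the eigendecomposition $B = Q \Lambda Q^\top$: rescaling each orthonormal eigenvector by $1/\sqrt{|\lambda_i|}$ gives columns satisfying $X^\top B X = \operatorname{diag}(\operatorname{sign}(\lambda_i))$, and sorting the positive eigenvalues first yields the block form $\operatorname{diag}(I_p, -I_q)$:

```python
import numpy as np

# Sketch: build a pseudo-orthonormal basis for an arbitrary nonsingular
# symmetric B from its eigendecomposition B = Q L Q^T.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = (A + A.T) / 2                  # symmetric; generically nonsingular

w, Q = np.linalg.eigh(B)           # eigenvalues w, orthonormal eigenvectors Q
order = np.argsort(-np.sign(w))    # positive eigenvalues first
w, Q = w[order], Q[:, order]

X = Q / np.sqrt(np.abs(w))         # columns x_i = q_i / sqrt(|w_i|)
D = X.T @ B @ X                    # equals diag(I_p, -I_q) up to rounding
print(np.round(D, 10))
```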

Now let $S$ be a nonsingular matrix transforming the basis $X$ to $SX$. For $SX$ to be pseudo-orthonormal again, you need $$ (SX)^\top B (SX) = X^\top S^\top B S X = \operatorname{diag}(I_p, -I_q).$$ Since $X$ is invertible, comparing with $X^\top B X = \operatorname{diag}(I_p, -I_q)$ forces $S^\top B S = B$. Taking determinants, $$\det B = \det(S^\top B S) = \det S^\top \det B \det S = (\det S)^2 \det B,$$ and since $\det B \neq 0$, we get $(\det S)^2 = 1$, i.e. $\det S = \pm 1$.
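The argument above can be checked numerically; here is a sketch (my own example) with $B = \operatorname{diag}(I_2, -I_1)$, where $S$ is built from two standard generators of the pseudo-orthogonal group $O(2,1)$: a rotation in the positive-definite block and a boost mixing a positive and a negative direction.

```python
import numpy as np

# B = diag(I_p, -I_q) with p = 2, q = 1.
B = np.diag([1.0, 1.0, -1.0])

theta, phi = 0.7, 1.2
R = np.eye(3)
R[:2, :2] = [[np.cos(theta), -np.sin(theta)],
             [np.sin(theta),  np.cos(theta)]]   # rotation in the + block
L = np.eye(3)
L[1:, 1:] = [[np.cosh(phi), np.sinh(phi)],
             [np.sinh(phi), np.cosh(phi)]]      # boost mixing + and - directions
S = R @ L                                       # a generic element of O(2, 1)

print(np.allclose(S.T @ B @ S, B))              # S preserves the form: S^T B S = B
print(np.isclose(np.linalg.det(S)**2, 1.0))     # hence (det S)^2 = 1
```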