Why do the columns of the inverse of a matrix (defined as a linear operator) form an orthogonal basis in an inner product space?


Let $V$ be a vector space over $\mathbb{C}$ and $W$ an inner product space over $\mathbb{C}$ with inner product $\langle \cdot, \cdot \rangle$, and let $T : V \to W$ be a linear transformation. Find an orthogonal basis for $V = \mathbb{R}^3$ with the inner product $\langle u, v \rangle' = \langle T(u), T(v) \rangle$, where $T$ is left multiplication by a $3 \times 3$ matrix $A$.

I know the solution is to take the columns of $A^{-1}$ as the orthogonal basis, but I am not quite sure why.

There is 1 best solution below


The question hands you a matrix $A$ that realizes the linear transformation $T : V \to W$: for $v \in V$, applying $T$ means computing the matrix-vector product, $T(v) = Av$.

Furthermore, we are equipping $W$ with an inner product, so we can compute $\langle w_1, w_2\rangle_W$. Note that I am using subscript $W$ around the inner product brackets to be explicitly clear which space I am computing the inner product over.

Now, we want our matrix $A$ to have the following properties: it is $3 \times 3$ and it is unitary. If we take $V, W$ to be vector spaces over the reals, a unitary matrix with real entries is usually called orthogonal. In either case, its rows will be orthogonal vectors.

Let's take a moment to ask "why rows and not columns?" Intuitively, when we multiply a matrix by a column vector, we are taking the inner product of each row of the matrix with that vector. In fact, the distinction doesn't matter here: unitary matrices have the property that both their rows and their columns are orthogonal (in fact orthonormal) with respect to the usual inner product.
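A quick numerical sanity check of that property, using a rotation matrix in $\mathbb{R}^3$ as a stand-in example of a (real) unitary matrix (the particular matrix is my choice, not one from the question):

```python
import numpy as np

# A rotation about the z-axis: orthogonal, i.e. unitary with real entries.
theta = 0.7
U = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

# Rows orthonormal: U @ U.T = I.  Columns orthonormal: U.T @ U = I.
print(np.allclose(U @ U.T, np.eye(3)))  # True
print(np.allclose(U.T @ U, np.eye(3)))  # True
```

Both checks succeed for any unitary matrix, which is exactly why the row/column question is moot.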

Now, since we have $\langle u, v\rangle_V = \langle T(u), T(v)\rangle_W = \langle Au, Av\rangle_W$, it is quite simple to find such a basis: take the columns of $A^{-1}$. Writing $b_i = A^{-1}e_i$ for the $i$-th column, we get
$$\langle b_i, b_j\rangle_V = \langle A A^{-1}e_i,\, A A^{-1}e_j\rangle_W = \langle e_i, e_j\rangle_W = \delta_{ij},$$
so the columns of $A^{-1}$ are orthonormal (in particular orthogonal) with respect to $\langle \cdot, \cdot\rangle_V$. In other words, $A^{-1}$ carries the standard basis vectors to a basis that $A$ maps back onto the standard basis, and the standard basis is orthonormal under the usual inner product on $W$.
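The computation above can be verified numerically. The matrix $A$ below is an arbitrary invertible example of my own choosing, not one from the question; the check works for any invertible $A$:

```python
import numpy as np

# Hypothetical invertible A standing in for "left multiplication by A".
A = np.array([
    [2.0, 1.0, 0.0],
    [0.0, 1.0, 3.0],
    [1.0, 0.0, 1.0],
])
B = np.linalg.inv(A)  # candidate basis: the columns of A^{-1}

# The inner product on V induced by T:  <u, v>' = <A u, A v>
def ip(u, v):
    return (A @ u) @ (A @ v)

# Gram matrix of the columns of A^{-1} under <.,.>'.
# It is the identity, since <A^{-1} e_i, A^{-1} e_j>' = <e_i, e_j>.
G = np.array([[ip(B[:, i], B[:, j]) for j in range(3)] for i in range(3)])
print(np.allclose(G, np.eye(3)))  # True
```

Note that the Gram matrix equals $(AB)^\top(AB)$ with $B = A^{-1}$, so the identity pops out regardless of which invertible $A$ you pick.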