Concrete form of the projection matrix


Suppose we have a linear subspace $S = \operatorname{span}(w_1, \ldots, w_m) \subset \mathbb R^{n}$, where $w_1, \ldots, w_m$ form an orthonormal basis for $S$. Given $x \in \mathbb R^n$, let $W = (w_1, \ldots, w_m)$ denote the matrix whose columns are the $w_i$; the projection of $x$ onto $S$ is given by $W^{T}W x$. I wonder how one proves this through the formula $Px = \sum_{i} \langle w_i, x \rangle w_i$.

On BEST ANSWER

So your premise is wrong here. The core thing to remember is orthonormality: $W^T W = I$, since $\langle w_i, w_j \rangle = 1$ when $i = j$ and $\langle w_i, w_j \rangle = 0$ when $i \neq j$. Your formula would then imply that the projection of $x$ is just $x$, which is not true in general, so the equation as you wrote it is wrong.
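A quick numerical check makes this concrete. This is just a sketch with a made-up orthonormal pair in $\mathbb R^3$ (so $m = 2$, $n = 3$), using NumPy:

```python
import numpy as np

# Hypothetical example: an orthonormal basis w_1, w_2 of a
# 2-dimensional subspace of R^3.
s = 1 / np.sqrt(2)
W = np.column_stack([[s, s, 0.0], [0.0, 0.0, 1.0]])

# Orthonormal columns give W^T W = I_m, the m x m identity:
print(np.allclose(W.T @ W, np.eye(2)))  # True

# But W W^T is not the identity, so the projection of x is not x itself:
x = np.array([1.0, 2.0, 3.0])
print(np.allclose(W @ W.T @ x, x))  # False
```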

What you are looking for instead is the least-squares solution, which is equivalent to a projection onto the column space of $W$: $\hat x = (W^T W)^{-1} W^T x$ solves the least-squares problem $\min_{\hat x} \|W \hat x - x\|^2$, which is a projection onto the columns of $W$.

Note that the inverse term is just the identity, due to that same orthonormality property, so $\hat x = W^T x$. Here $\hat x$ is the best approximation to $x$ expressible in the basis of $W$.
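A small NumPy sketch of this collapse (the subspace and the vector $x$ are randomly generated just for illustration), checking that the general least-squares formula reduces to $W^T x$ when the columns are orthonormal:

```python
import numpy as np

rng = np.random.default_rng(0)
# Orthonormal basis of a random 2-dimensional subspace of R^4, via QR.
W, _ = np.linalg.qr(rng.standard_normal((4, 2)))
x = rng.standard_normal(4)

# General least-squares solution of min ||W x_hat - x||^2:
x_hat_general = np.linalg.inv(W.T @ W) @ W.T @ x
# With orthonormal columns, (W^T W)^{-1} = I, so it collapses to W^T x:
x_hat = W.T @ x
print(np.allclose(x_hat_general, x_hat))  # True
```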

This gives us the estimate $\hat x = [\langle w_1, x \rangle, \ldots, \langle w_m, x \rangle]^T$.

To view $\hat x$ in the standard basis, take $W \hat x = \sum_i \langle w_i, x \rangle w_i = Px$, exactly the formula you wrote. So the projection matrix is $P = W W^T$, not $W^T W$.
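To check that both descriptions agree, here is a sketch (again with a randomly generated subspace) verifying that $W \hat x$ matches the sum $\sum_i \langle w_i, x \rangle w_i$, and that $P = W W^T$ really is a projection (applying it twice changes nothing):

```python
import numpy as np

rng = np.random.default_rng(1)
W, _ = np.linalg.qr(rng.standard_normal((5, 3)))  # orthonormal basis, m = 3
x = rng.standard_normal(5)

# Coordinates of the projection in the basis {w_i}: x_hat_i = <w_i, x>.
x_hat = W.T @ x
# Back in the standard basis: P x = W x_hat = sum_i <w_i, x> w_i.
Px = W @ x_hat
Px_sum = sum(np.dot(W[:, i], x) * W[:, i] for i in range(3))
print(np.allclose(Px, Px_sum))  # True

# P = W W^T is idempotent: projecting the projection gives it back.
print(np.allclose(W @ W.T @ Px, Px))  # True
```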

You can also view this in terms of Gram-Schmidt, the process that generates orthogonal vectors: it shows that for any subspace you can find an orthonormal basis, and its computation looks very similar to $Px$.
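To see that resemblance, here is a minimal classical Gram-Schmidt sketch (the input vectors are made up for the example; in practice `np.linalg.qr` is the numerically stable way to get an orthonormal basis). Each step subtracts exactly the $\sum_i \langle w_i, v \rangle w_i$ projection from the next vector:

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormalize independent vectors.

    Each step removes the projection onto the basis built so far,
    i.e. the sum of <w_i, v> w_i terms from the formula for Px.
    """
    basis = []
    for v in vectors:
        v = v - sum(np.dot(w, v) * w for w in basis)
        basis.append(v / np.linalg.norm(v))
    return np.column_stack(basis)

# Two independent (not orthogonal) vectors in R^3:
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
W = gram_schmidt(A.T)          # iterate over the columns of A
print(np.allclose(W.T @ W, np.eye(2)))  # True: columns are orthonormal
```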