Prove that a projection matrix for a subspace is the basis matrix of the subspace times its transpose.


Let $V$ be a linear subspace of $\mathbb{R}^n$ and let $U$ be a matrix whose columns form an orthonormal basis for $V$. How can I prove that $$\operatorname{proj}_V(x) = UU^Tx?$$

I've been struggling with linear algebra, so specific, concrete, and minimally jargon-filled answers would be appreciated :) but anything is helpful, obviously.
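Before proving the claim, it can help to see it numerically. Below is a small sketch using NumPy; the particular subspace, basis, and vector are invented purely for illustration. The columns of $U$ are orthonormal, and $UU^Tx$ should land in $V$ while the residual $x - UU^Tx$ is orthogonal to $V$.

```python
import numpy as np

# Orthonormal basis for a 2-dimensional subspace V of R^3:
# the columns of U are orthonormal (U^T U = I).
U = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

x = np.array([3.0, -1.0, 2.0])

p = U @ U.T @ x   # the claimed projection UU^T x
r = x - p         # residual

print(p)          # [ 3. -1.  0.]  -- the component of x inside V
print(U.T @ r)    # [0. 0.]        -- residual is orthogonal to V
```

Here $V$ is just the $xy$-plane, so the projection visibly zeroes out the third coordinate while the residual points along the leftover direction.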


There are 2 answers below.


Let $u_1, \ldots, u_k$ be the columns of $U$. Let $v_{k+1}, \ldots, v_n$ be vectors such that $\{u_1, \ldots, u_k, v_{k+1}, \ldots, v_n\}$ is a basis for $\mathbb{R}^n$ with $u_i^T v_j = 0$ for all $i \le k < j$.

Now for any $x$, there exist scalars $c_1, \ldots, c_n$ such that $x = \sum_{i=1}^k c_i u_i + \sum_{i=k+1}^n c_i v_i$.

Now note that $U^T = \begin{bmatrix} u_1^T \\ \vdots \\ u_k^T\end{bmatrix}$. So if we let $y = U^Tx$, we have that the $j$th coordinate of $y$ is given by

$$ y_j = \sum_{i=1}^k c_i u_j^T u_i + \sum_{i=k+1}^n c_i u_j^T v_i = c_j, $$

where the last equality holds because $u_j^T u_i = 1$ if $i = j$ and $0$ otherwise (the $u_i$ are orthonormal), and $u_j^T v_i = 0$ by construction. Thus $y^T = [c_1, \ldots, c_k]$. Then finally we have that

$$Uy = \begin{bmatrix} u_1 \ldots u_k\end{bmatrix} \begin{bmatrix} c_1 \\ \vdots \\ c_k \end{bmatrix} = \sum_{i=1}^kc_iu_i,$$

which is exactly the component of $x$ lying in $V$. Hence $UU^Tx = Uy = \operatorname{proj}_V(x)$.
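A quick numerical sanity check of this construction (a sketch only; the particular orthonormal basis is produced with NumPy's QR factorization and is not part of the answer itself):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build an orthonormal basis U for a random 3-dimensional subspace of R^5.
A = rng.standard_normal((5, 3))
U, _ = np.linalg.qr(A)   # columns of U are orthonormal: U^T U = I

x = rng.standard_normal(5)

# y = U^T x recovers the coefficients c_1, ..., c_k of x along u_1, ..., u_k,
# and Uy = UU^T x is the component of x inside the subspace.
y = U.T @ x
p = U @ y

# The residual x - p is orthogonal to every column of U, as in the proof.
assert np.allclose(U.T @ (x - p), 0)

# Projecting an already-projected vector changes nothing.
assert np.allclose(U @ U.T @ p, p)
print("checks passed")
```

The two assertions mirror the two halves of the argument: $U^Tx$ extracts the coefficients in the $u_i$ directions, and the $v_i$-components of $x$ are annihilated.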


Here is another cute, and maybe simpler, proof using convex optimization.

Let $y \in \mathbb{R}^n$ be the vector to be projected onto $V$ (whose dimension we denote by $k$). That is, we want to find the closest vector in $V$ to $y$; writing that vector as $Ux$ for a coefficient vector $x$, we are seeking to solve $$ \arg\min_{x\in\mathbb{R}^k} \|Ux-y\|^2. $$ Differentiating with respect to $x$ gives $$ \frac{\partial}{\partial x} \|Ux-y\|^2 = \frac{\partial}{\partial x} (Ux-y)^T(Ux-y) = \frac{\partial}{\partial x} \left(x^TU^TUx - 2x^TU^Ty + y^Ty\right) = 2U^TUx - 2U^Ty \overset{!}{=} 0,$$ and thus the minimizer is $x=(U^TU)^{-1}U^Ty$. The projected vector in $V$ is $Ux=U(U^TU)^{-1}U^Ty = UU^Ty$, where the last equality holds because the columns of $U$ are orthonormal, so $U^TU=I$. Note that $U(U^TU)^{-1}U^T$ also gives you a projection formula for non-orthonormal bases.
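The closing remark can be checked numerically as well: for a non-orthonormal basis $B$, the projector $B(B^TB)^{-1}B^T$ agrees with $UU^T$ built from an orthonormalization of the same columns. A sketch using NumPy, with the specific matrices invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# B: a (generally non-orthonormal) basis for a 2-dim subspace of R^4.
B = rng.standard_normal((4, 2))

# General projector from the least-squares derivation above.
P_general = B @ np.linalg.inv(B.T @ B) @ B.T

# Orthonormalize the same columns; the projector then simplifies to UU^T.
U, _ = np.linalg.qr(B)
P_ortho = U @ U.T

# Same subspace, same projector.
assert np.allclose(P_general, P_ortho)

# Both are orthogonal projections: symmetric and idempotent.
assert np.allclose(P_general, P_general.T)
assert np.allclose(P_general @ P_general, P_general)
print("same projector")
```

This makes the role of $(B^TB)^{-1}$ concrete: it compensates for the non-orthonormality of the columns, and collapses to the identity exactly when $B^TB = I$.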