How to find the projection onto a symmetric space?


A square matrix $A$ is symmetric if $A = A^T$. Let $S$ be the subspace of symmetric matrices inside the vector space of all $2 \times 2$ matrices. Consider the matrix $$X = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$$ where $a, b, c, d$ are known constants.

A) Find the orthogonal projection of $X$ onto the subspace $S$.

B) Find the orthogonal projection of $X$ onto $S^\perp$.

My attempt: I've never projected onto a subspace of matrices before; I've only ever projected vectors onto vectors, but I think the steps are probably the same.

Finding an orthogonal basis for the subspace of symmetric matrices: $$\begin{bmatrix} 1 & 0 \\ 0 & 0\\ \end{bmatrix}, \begin{bmatrix} 0 & 1 \\ 1 & 0\\ \end{bmatrix}, \begin{bmatrix} 0 & 0 \\ 0 & 1\\ \end{bmatrix}$$
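To check that these three matrices really are an orthogonal basis, one can use the Frobenius inner product $\langle A, B \rangle = \operatorname{tr}(A^T B)$, which is the matrix analogue of the dot product. A quick NumPy sketch (the names `E1`, `E2`, `E3`, `frob` are my own):

```python
import numpy as np

# Candidate basis matrices for S, the symmetric 2x2 matrices.
E1 = np.array([[1.0, 0.0], [0.0, 0.0]])
E2 = np.array([[0.0, 1.0], [1.0, 0.0]])
E3 = np.array([[0.0, 0.0], [0.0, 1.0]])

def frob(A, B):
    """Frobenius inner product <A, B> = trace(A^T B)."""
    return np.trace(A.T @ B)

# All pairwise inner products vanish, so the basis is orthogonal.
print(frob(E1, E2), frob(E1, E3), frob(E2, E3))  # 0.0 0.0 0.0

# Note E2 is orthogonal but not normalized: its squared norm is 2.
print(frob(E2, E2))  # 2.0
```

So the basis is orthogonal, but the middle matrix has Frobenius norm $\sqrt{2}$ rather than $1$.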

Here's where things get confusing. I need to find how much of $X$ lies in the "direction" of each of my basis matrices, right?

So,

$$\operatorname{Proj}(X) = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} X + \begin{bmatrix} 0 & \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & 0 \end{bmatrix} X + \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} X$$

where the second matrix is normalized so that it has "length" $1$. Is this the correct approach?


I actually computed it and it didn't seem to work. In the end I got $$\operatorname{Proj}(X) = \begin{bmatrix} a + \frac{c}{\sqrt{2}} & b + \frac{d}{\sqrt{2}} \\ \frac{a}{\sqrt{2}} + c & \frac{b}{\sqrt{2}} + d \end{bmatrix}$$

To get a symmetric matrix, I need the off-diagonal terms to be equal. How can I go about finding this projection?
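For what it's worth, here is a numerical sketch of the vector-style projection formula $\sum_i \frac{\langle X, E_i \rangle}{\langle E_i, E_i \rangle} E_i$, where the coefficients are scalar Frobenius inner products rather than matrix products. The values of $a, b, c, d$ are arbitrary assumed constants:

```python
import numpy as np

a, b, c, d = 1.0, 2.0, 3.0, 4.0   # assumed example values for the known constants
X = np.array([[a, b], [c, d]])

# Orthogonal basis of S (symmetric 2x2 matrices).
E1 = np.array([[1.0, 0.0], [0.0, 0.0]])
E2 = np.array([[0.0, 1.0], [1.0, 0.0]])
E3 = np.array([[0.0, 0.0], [0.0, 1.0]])

def frob(A, B):
    """Frobenius inner product <A, B> = trace(A^T B)."""
    return np.trace(A.T @ B)

# Project with scalar coefficients <X, E_i> / <E_i, E_i>,
# exactly as one would with ordinary vectors.
P = sum(frob(X, E) / frob(E, E) * E for E in (E1, E2, E3))
print(P)
# Both off-diagonal entries come out as (b + c) / 2,
# so P coincides with the symmetric part (X + X^T) / 2.
```

The point of the sketch is that each coefficient is a number, not a matrix, so the result is automatically a combination of symmetric basis matrices and hence symmetric.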


For part B, I'm not really sure how to find $S^{\perp}$. If $S$ were spanned by a set of vectors, I would stack those vectors as the rows of a matrix and take its nullspace to get $S^{\perp}$. How do I do that in this situation, where we have a space of matrices?
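One way to make the vector approach literal, as a sketch: flatten each $2 \times 2$ matrix into a vector in $\mathbb{R}^4$, stack the flattened basis matrices of $S$ as rows, and compute that matrix's nullspace (here via the SVD). The variable names are my own:

```python
import numpy as np

# Each basis matrix of S, flattened row-by-row into a length-4 vector.
basis_S = np.array([
    [1, 0, 0, 0],   # [[1,0],[0,0]]
    [0, 1, 1, 0],   # [[0,1],[1,0]]
    [0, 0, 0, 1],   # [[0,0],[0,1]]
], dtype=float)

# The nullspace of this 3x4 matrix corresponds to S-perp under the
# Frobenius inner product.  The last right-singular vector spans it,
# since the matrix has rank 3.
_, _, Vt = np.linalg.svd(basis_S)
null_vec = Vt[-1]
print(null_vec.reshape(2, 2))
# Up to sign and scaling this is [[0, 1], [-1, 0]], i.e. S-perp is
# the one-dimensional space of antisymmetric 2x2 matrices.
```

So, under this flattening, $S^\perp$ comes out as the skew-symmetric matrices $M$ with $M^T = -M$.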