
For clarity, I am posting a picture. I cannot do this exercise from my lecture notes....
I have browsed the previous material, which focuses on the sum of orthogonal projections onto an $n$-dimensional subspace.
Any hint or suggestion is appreciated. Thank you!!

Let’s call your two bases $B_1$ and $B_2$, where the $i$th column is the $i$th basis vector. The matrix $P$ described is a change-of-basis matrix, though it is slightly unusual in that it acts on the right, $$B_1P=B_2,$$ instead of the left.
Also, as Arthur said, we need to assume that the two bases are orthonormal, because this means $B_1$ and $B_2$ are orthogonal matrices, and so we have $$P=B_1^{-1}B_2=B_1^TB_2.$$
The transpose of an orthogonal matrix is orthogonal, and the product of two orthogonal matrices is orthogonal, so $P$ is orthogonal.
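Here is a quick numeric sanity check of this claim, a minimal sketch assuming NumPy and two orthonormal bases of $\mathbb{R}^3$ generated via QR factorization (the random bases are illustrative, not from the exercise):

```python
import numpy as np

# Two hypothetical orthonormal bases of R^3, as columns of B1 and B2,
# obtained by QR-factorising random matrices.
rng = np.random.default_rng(0)
B1, _ = np.linalg.qr(rng.standard_normal((3, 3)))
B2, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Change-of-basis matrix P with B1 @ P = B2; since B1 is orthogonal,
# B1^{-1} = B1^T, so P = B1^T @ B2.
P = B1.T @ B2

# P is orthogonal (P^T P = I) and reproduces B2 from B1.
assert np.allclose(P.T @ P, np.eye(3))
assert np.allclose(B1 @ P, B2)
```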
As for $2$, consider $\sum_j (x\cdot \hat{e}_j)\hat{e}_j$ and expand the rightmost $\hat{e}_j$ using the formula provided. Work out the coefficient of $e_i$ in the resulting summation: you'll find that it is $x\cdot \sum_j P_{ij}\hat{e}_j$, and from there you can proceed to the desired result.
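The coefficient computation can also be checked numerically. Below is a sketch under the same assumptions as above: the columns of `E` are the old basis $e_i$, the columns of `F` are the new basis $\hat{e}_j$ with $F = EP$, and all names are illustrative:

```python
import numpy as np

# Hypothetical orthonormal bases: columns of E are e_i, columns of F
# are ê_j, related by F = E @ P with P = E^T @ F orthogonal.
rng = np.random.default_rng(1)
E, _ = np.linalg.qr(rng.standard_normal((3, 3)))
F, _ = np.linalg.qr(rng.standard_normal((3, 3)))
P = E.T @ F

x = rng.standard_normal(3)

# Left-hand side: sum_j (x·ê_j) ê_j.
lhs = sum((x @ F[:, j]) * F[:, j] for j in range(3))

# After splitting the rightmost ê_j = sum_i P_ij e_i, the coefficient
# of e_i is x · (sum_j P_ij ê_j); here F @ P[i, :] = sum_j P_ij ê_j.
coeffs = np.array([x @ (F @ P[i, :]) for i in range(3)])
rhs = sum(coeffs[i] * E[:, i] for i in range(3))

# Same vector either way, and since sum_j P_ij ê_j collapses to e_i
# (because P P^T = I), the coefficient of e_i is just x·e_i.
assert np.allclose(lhs, rhs)
assert np.allclose(coeffs, E.T @ x)
```

This confirms the hint: expanding the sum over the new basis recovers the same expansion in the old basis, which is the desired result.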