EDIT: This forum post was helpful for my understanding of why José's answer is correct. Further clarification about this general problem can be found in the comments section.
Beginner here.
I need to calculate the rank of the following linear transformation:
$\sigma: M(2, \mathbb{R}) \rightarrow M(2, \mathbb{R}), \quad A \mapsto \frac{1}{2}(A + A^{T})$
Progress so far:
It seems that this question consists of two parts: (1) find a matrix that represents this mapping, and (2) find the rank of that matrix, just like I would for any other matrix.
(1):
$ \sigma\left(\begin{pmatrix} a & b \\ c & d \end{pmatrix}\right) = \frac{1}{2} \begin{pmatrix} a + a & b + c \\ c + b & d + d \end{pmatrix} = \begin{pmatrix} a & \frac{b + c}{2} \\ \frac{b + c}{2} & d \end{pmatrix} $
At this point, however, I am stuck. Is my foundation correct, or do I need to approach this question differently?
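For reference, here is one way part (1) could be carried to completion (a sketch; the ordered basis $E_{11}, E_{12}, E_{21}, E_{22}$, where $E_{ij}$ has a $1$ in position $(i,j)$ and zeros elsewhere, is one choice among many). The columns of the representing matrix are the coordinate vectors of $\sigma(E_{11}), \sigma(E_{12}), \sigma(E_{21}), \sigma(E_{22})$:

$$[\sigma] = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & \frac{1}{2} & \frac{1}{2} & 0 \\ 0 & \frac{1}{2} & \frac{1}{2} & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}$$

Its second and third rows coincide, so this matrix has rank $3$.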
P.S. So far I have considered these sources:
1) A question from math.stackexchange defining relevant terms.
2) An apparently related question considering a mapping from $\mathbb{R}^2$ to $\mathbb{R}^3$.
But I am still having trouble internalizing the methodology. Thanks in advance.
What you did proves that every matrix in $\operatorname{range}(\sigma)$ is a $2\times2$ symmetric matrix. Furthermore, if $M$ is such a matrix, then $\sigma(M)=M$, so every symmetric matrix is attained. Therefore, $\operatorname{range}(\sigma)$ is the space of all $2\times2$ symmetric matrices, which has dimension $3$ (a basis is $E_{11}$, $E_{22}$, $E_{12}+E_{21}$). So, $\operatorname{rank}\sigma=3$.
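As a quick sanity check, the rank can also be computed from the $4\times4$ matrix of $\sigma$ in the standard basis $E_{11}, E_{12}, E_{21}, E_{22}$. The sketch below is my own illustration (the basis ordering and the helper `rank` are not from the original post); it uses plain Gaussian elimination rather than any library routine:

```python
# Matrix of sigma in the ordered basis E11, E12, E21, E22.
# Column j holds the coordinates of sigma applied to the j-th basis matrix.
M = [
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 0.5, 0.5, 0.0],
    [0.0, 0.5, 0.5, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]

def rank(rows, eps=1e-12):
    """Rank via Gaussian elimination with partial pivoting."""
    rows = [r[:] for r in rows]  # work on a copy
    m, n = len(rows), len(rows[0])
    r = 0  # index of the next pivot row
    for c in range(n):
        if r == m:
            break
        # Pick the row (at or below r) with the largest entry in column c.
        pivot = max(range(r, m), key=lambda i: abs(rows[i][c]))
        if abs(rows[pivot][c]) < eps:
            continue  # column is (numerically) zero below row r
        rows[r], rows[pivot] = rows[pivot], rows[r]
        # Eliminate column c from all rows below the pivot.
        for i in range(r + 1, m):
            f = rows[i][c] / rows[r][c]
            for j in range(c, n):
                rows[i][j] -= f * rows[r][j]
        r += 1
    return r

print(rank(M))  # prints 3
```

The tolerance `eps` guards against floating-point round-off being mistaken for a nonzero pivot; with exact arithmetic (e.g. fractions) it could be dropped.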