I want to prove the following theorem:
With respect to any orthonormal basis, if the 2 $\times$ 2 matrix $\bigl(\begin{smallmatrix} a&b\\ c&d \end{smallmatrix} \bigr)$ represents a symmetric linear transformation, then $b = c$.
Now let $T: \mathbb{R}^2 \rightarrow \mathbb{R}^2$ be a symmetric linear transformation and $A = \bigl(\begin{smallmatrix} a&b\\ c&d \end{smallmatrix} \bigr)$ the matrix of this transformation with respect to the orthonormal basis $B = \{\bigl(\begin{smallmatrix} v_{1}\\ v_{2} \end{smallmatrix} \bigr),\bigl(\begin{smallmatrix} w_{1}\\ w_{2} \end{smallmatrix} \bigr)\}$. We get the following equations:
1. $v_{1}^2 + v_{2}^2 = 1$, because $v$ is a unit vector.
2. $w_{1}^2 + w_{2}^2 = 1$, because $w$ is a unit vector.
3. $v_{1}w_{1} + v_{2}w_{2} = 0$, because $v$ and $w$ are perpendicular.
4. $aw_{1} + cw_{2} = bv_{1} + dv_{2}$, because $T(v) = \bigl(\begin{smallmatrix} a\\c \end{smallmatrix} \bigr)$, $T(w) = \bigl(\begin{smallmatrix} b\\d \end{smallmatrix} \bigr)$ and $\langle T(v),w\rangle = \langle v,T(w) \rangle$.
5. $av_{1} + bv_{2} = a$, because $\bigl(\begin{smallmatrix} a\\c \end{smallmatrix} \bigr) = T(v) = Av = \bigl(\begin{smallmatrix} a&b\\ c&d \end{smallmatrix} \bigr) \bigl(\begin{smallmatrix} v_{1}\\ v_{2} \end{smallmatrix} \bigr)$
6. $cv_{1} + dv_{2} = c$
7. $aw_{1} + bw_{2} = b$, because $\bigl(\begin{smallmatrix} b\\d \end{smallmatrix} \bigr) = T(w) = Aw = \bigl(\begin{smallmatrix} a&b\\ c&d \end{smallmatrix} \bigr) \bigl(\begin{smallmatrix} w_{1}\\ w_{2} \end{smallmatrix} \bigr)$
8. $cw_{1} + dw_{2} = d$
Now we take $6$ and $7$ and substitute them into $4$ (from $7$, $aw_{1} = b - bw_{2}$; from $6$, $dv_{2} = c - cv_{1}$): $$b - bw_{2} + cw_{2} = bv_{1} + c - cv_{1}$$ $$b(1-v_{1}-w_{2}) = c(1-v_{1}-w_{2})$$
The only thing left is to show that $(1-v_{1}-w_{2}) \neq 0$. But how can I do this? Furthermore, I didn't use equations $1$ and $2$.
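As a numerical sanity check (not a proof), here is a short NumPy sketch; the symmetric map $S$ and the rotation angle are made-up example values of my own. It represents a symmetric transformation in a rotated orthonormal basis and confirms the off-diagonal entries agree:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up example: standard matrix of a symmetric transformation T.
S = np.array([[2.0, 3.0],
              [3.0, -1.0]])

# Any angle gives an orthonormal basis {v, w} of R^2.
theta = rng.uniform(0, 2 * np.pi)
v = np.array([np.cos(theta), np.sin(theta)])
w = np.array([-np.sin(theta), np.cos(theta)])
P = np.column_stack([v, w])   # change-of-basis matrix; P^T P = I

# Matrix of T with respect to the basis {v, w}.
A = P.T @ S @ P
print(np.isclose(A[0, 1], A[1, 0]))  # b == c, prints True
```

This only illustrates the statement for one example; the general argument still needs the inner-product identity $\langle T(v),w\rangle = \langle v,T(w)\rangle$.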
Best regards!
Let's write $(;)$ for the usual scalar product, which we'll use throughout; the canonical basis is orthonormal for it. The proof stays true with respect to any orthonormal basis.
Say $u$ is the linear transformation associated to $A$, and $u^*$ is its adjoint. For a symmetric linear transformation you have $u = u^*$ by definition, while $u^*$ is defined as follows:
$(u(x);y) = (x;u^*(y))$ for every pair of vectors $(x,y)$.
Using the notation you took for the matrix of $u$, and considering $x = e_1$, $y = e_2$, the two vectors of the canonical basis (so $(e_1;e_2) = 0$), you have:
$u(e_1) = a e_1 + c e_2$, $u(e_2) = b e_1 + d e_2$
$\Rightarrow (u(e_1);e_2) = c = (e_1;u(e_2)) = b$
So that's where the symmetry comes in; it's really the reason why your transformation has this property.
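Spelling out the two scalar products in the last step, using only bilinearity and orthonormality of $e_1, e_2$:
$$(u(e_1);e_2) = (a e_1 + c e_2;e_2) = a(e_1;e_2) + c(e_2;e_2) = c$$
$$(e_1;u(e_2)) = (e_1;b e_1 + d e_2) = b(e_1;e_1) + d(e_1;e_2) = b$$
and $u = u^*$ gives $(u(e_1);e_2) = (e_1;u(e_2))$, hence $c = b$. This also shows where your equations $1$ and $2$ enter: $(e_1;e_1) = (e_2;e_2) = 1$ is exactly the unit-vector condition.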