Let $V$ be a finite-dimensional vector space over $\mathbb{C}$ with a positive-definite inner product $\langle-,-\rangle$, and let $T: V\rightarrow V$ be linear and self-adjoint. Show that:
$\|T(v)-iv\|^2=\|T(v)\|^2+\|v\|^2$
My attempt:
First, note that $\forall v \in V,\ \ \langle T(v),v \rangle \in \mathbb{R}$ since: \begin{equation} \langle T(v),v \rangle=\langle v,T^*(v) \rangle=\langle v,T(v) \rangle=\overline{\langle T(v),v\rangle} \end{equation}
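Not part of the proof, but here is a quick numerical sanity check of this step (a sketch using NumPy; the Hermitian matrix $T$ and the vector $v$ are random, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# Build a random Hermitian (self-adjoint) matrix: T = (A + A^H)/2
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
T = (A + A.conj().T) / 2
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# <T(v), v> with <x, y> = sum_i x_i * conj(y_i)  (linear in the first slot);
# np.vdot conjugates its FIRST argument, so <x, y> = np.vdot(y, x)
inner = np.vdot(v, T @ v)
print(abs(inner.imag))  # should be ~0, i.e. <T(v), v> is real
```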
Now,
\begin{align}
\langle T(v)-iv,\,T(v)-iv \rangle &= \langle T(v),\,T(v)-iv \rangle + \langle -iv,\,T(v)-iv \rangle \\
&= \overline{\langle T(v)-iv,\,T(v) \rangle}+ \overline{\langle T(v)-iv,\,-iv \rangle} \\
&= \overline{\langle T(v),\,T(v) \rangle}+ \overline{\langle -iv,\,T(v) \rangle} + \overline{\langle T(v),\,-iv \rangle}+ \overline{\langle -iv,\,-iv \rangle}
\end{align}
Since $\overline{\langle T(v),T(v) \rangle}=\|T(v)\|^2$ and $\overline{\langle -iv,-iv \rangle}=\|v\|^2$, I'm done if I can show that \begin{equation} \overline{\langle -iv,T(v) \rangle } + \overline{\langle T(v),-iv \rangle }=0 \end{equation} but I don't see why this is true. Or am I totally off track here? Any help would be appreciated.
Thanks.
You're so close. Using self-adjointness (in particular the fact you already proved, that $\langle T(v),v \rangle$ is real) and factoring out the $i$'s, $$\overline{\langle -iv,T(v) \rangle} + \overline{\langle T(v),-iv \rangle} = i \cdot \langle T(v),v \rangle -i \cdot \langle T(v),v \rangle =0 $$
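For what it's worth, the full identity can also be checked numerically (a sketch with NumPy; the random Hermitian $T$ and vector $v$ are illustrative assumptions, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
# Random Hermitian matrix, so T is self-adjoint w.r.t. the standard inner product
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
T = (A + A.conj().T) / 2
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# ||T(v) - i v||^2  versus  ||T(v)||^2 + ||v||^2
lhs = np.linalg.norm(T @ v - 1j * v) ** 2
rhs = np.linalg.norm(T @ v) ** 2 + np.linalg.norm(v) ** 2
print(np.isclose(lhs, rhs))
```

If $T$ is replaced by a non-Hermitian matrix, the two sides will generically differ, which is a nice way to see that self-adjointness is really used.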