Suppose that $x$ and $y$ are two random variables with covariance
$\operatorname{cov}(x,y) = E[(x - E(x))(y-E(y))] $
This means to me that
$\operatorname{cov}(x,y) = E[(y - E(y))(x-E(x))] $
which means
$\operatorname{cov}(x,y) = \operatorname{cov}(y,x)$ and that covariance is a commutative operator.
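As a quick numerical sanity check (not part of the proof), here is a minimal Python sketch of the sample covariance $E[(x - E(x))(y - E(y))]$; the function names `mean` and `cov` are just illustrative helpers, not a standard API:

```python
# Sanity check: the sample covariance gives the same value
# in both argument orders, since each term (xi - mx)*(yi - my)
# is an ordinary commutative product of numbers.

def mean(values):
    return sum(values) / len(values)

def cov(x, y):
    # Plain sample covariance: average of (x_i - E(x)) * (y_i - E(y)).
    mx, my = mean(x), mean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)

x = [2.0, 4.0, 6.0, 8.0]
y = [1.0, 3.0, 2.0, 5.0]

# Both orders agree exactly, term by term.
assert cov(x, y) == cov(y, x)
```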
Suppose now that the covariance of a column vector $v = \begin{pmatrix} v_1\\ \vdots\\ v_n \end{pmatrix}$ is defined as $\operatorname{cov}(v) = E[(v - E(v))(v - E(v))^T]$
which means
$\operatorname{cov}(v) = E\left[\left( \begin{pmatrix} v_1\\ \vdots\\ v_n \end{pmatrix} - E(v)\right)\left(\begin{pmatrix} v_1\\ \vdots\\ v_n \end{pmatrix} - E(v)\right)^T\right]$
which means
$\operatorname{cov}(v) = E\left[\left( \begin{pmatrix} v_1\\ \vdots\\ v_n \end{pmatrix} - E(v)\right)\begin{pmatrix} v_1 - E(v_1) & \cdots & v_n - E(v_n) \end{pmatrix}\right]$
which means
$\operatorname{cov}(v) = E\left[ \begin{pmatrix} v_1 - E(v_1)\\ \vdots\\ v_n - E(v_n) \end{pmatrix}\begin{pmatrix} v_1 - E(v_1) & \cdots & v_n - E(v_n) \end{pmatrix}\right]$
which means, taking the expectation entrywise so that the $(i,j)$-th entry is $E[(v_i - E(v_i))(v_j - E(v_j))] = \operatorname{cov}(v_i, v_j)$,
$\operatorname{cov}(v) = \begin{pmatrix} \operatorname{cov}(v_1,v_1) & \cdots & \operatorname{cov}(v_1,v_n) \\ \vdots & \ddots & \vdots \\ \operatorname{cov}(v_n,v_1) & \cdots & \operatorname{cov}(v_n,v_n) \end{pmatrix}$
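Again as a numerical illustration only, a Python sketch that builds this matrix entry by entry as $M_{ij} = \operatorname{cov}(v_i, v_j)$ from sample data and checks that it comes out symmetric (the sample data and helper names are made up for the example):

```python
# Build the n-by-n covariance matrix M with M[i][j] = cov(v_i, v_j)
# from sample data, then check symmetry M[i][j] == M[j][i].

def mean(values):
    return sum(values) / len(values)

def cov(x, y):
    # Plain sample covariance of two lists of equal length.
    mx, my = mean(x), mean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)

# Each row holds samples of one component v_i of the random vector v.
samples = [
    [2.0, 4.0, 6.0, 8.0],   # samples of v_1
    [1.0, 3.0, 2.0, 5.0],   # samples of v_2
    [0.0, 1.0, 1.0, 2.0],   # samples of v_3
]
n = len(samples)
M = [[cov(samples[i], samples[j]) for j in range(n)] for i in range(n)]

# Symmetry follows from cov(v_i, v_j) == cov(v_j, v_i).
assert all(M[i][j] == M[j][i] for i in range(n) for j in range(n))
```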
Let's now call $M$ the covariance matrix $\operatorname{cov}(v)$ of the column vector $v = \begin{pmatrix} v_1\\ \vdots\\ v_n \end{pmatrix}$
Can I now just say $M$ is a symmetric $n$ by $n$ matrix such that the $(i,j)$-th element is the covariance of the variables $v_i$ and $v_j$?
Is this a solid proof for such a statement? Or did I forget something?
Covariance is a commutative operator: $\operatorname{cov}(x,y) = \operatorname{cov}(y,x)$.
A covariance matrix is always a symmetric $n$ by $n$ matrix, since $M_{ij} = \operatorname{cov}(v_i, v_j) = \operatorname{cov}(v_j, v_i) = M_{ji}$.