Covariance Matrix of Dataset


I was looking at the following question:

Why does this covariance matrix have additional symmetry along the anti-diagonals?

and I realized that I don't really understand the meaning of covariance matrices (or even variance). In the case presented in that question, the book states that the diagonal of the covariance matrix is populated with $1$'s.

From the derivation and the given relationship between $z_1$ and $z_2$, I understand that this doesn't work out mathematically in general, but I can't figure out intuitively why that should be.

As I understand it, the covariance between a random variable and itself is equal to its variance, but we also have the definition from Wikipedia: "covariance is a measure of how much two random variables change together."

It seems reasonable to conclude from the definition that a random variable varies perfectly with itself, which would imply cov$(x,x)=1$. But it's clearly the case that for arbitrary random variable $x$, var$(x)$ is not necessarily $1$.
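To make the gap concrete, here is a quick numerical check (a sketch using numpy with made-up sample values): the covariance of a variable with itself is its variance, which can be any nonnegative number, while the *correlation* of a variable with itself is always $1$ because it divides out the standard deviations.

```python
import numpy as np

# Hypothetical sample values; nothing forces the variance to be 1.
x = np.array([1.0, 2.0, 4.0, 7.0])

# cov(x, x) is just var(x): every entry of this 2x2 matrix
# equals the sample variance of x.
C = np.cov(x, x)
print(C[0, 0])  # equals np.var(x, ddof=1), here 7.0 -- not 1

# corr(x, x) = cov(x, x) / (sd(x) * sd(x)) = 1, always.
R = np.corrcoef(x, x)
print(R[0, 0])  # 1.0
```

So "varies perfectly with itself" corresponds to correlation $1$, not covariance $1$; a covariance matrix has $1$'s on its diagonal only when each variable happens to have unit variance.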

I think my lack of understanding comes from what it means for two RVs to change together. Any pointers as to where my intuition falls short? This is a very basic probability theory question, so please bear with me.