What is the second moment for a symmetric set of vectors?


I am new to vector statistics and just wanted to check if I'm having a correct deduction here.

I have a set of vectors from an $N$-dimensional space $$ v_k=\begin{bmatrix} v_{k_1} \\ v_{k_2} \\ \vdots \\ v_{k_N} \end{bmatrix} $$ whose elements are either $-1$ or $1$. Suppose the set contains every possible such vector, so there are $2^N$ of them. I know the first moment of these vectors is $0$ by symmetry; can I say that the second moment, the variance-covariance matrix, equals the $N\times N$ identity matrix?

$$ \operatorname{cov}_{ij} = \frac{1}{2^N} \sum_{k=1}^{2^N} [(v_{k_i}-\mu_k)(v_{k_j}-\mu_k)] $$

where $\mu_k$ is the $k$th element of the first moment vector.
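The claim is easy to check numerically by enumerating all $2^N$ sign vectors; a minimal sketch in Python with NumPy (assuming each vector is counted once and the average is taken over all $2^N$ of them):

```python
import itertools
import numpy as np

N = 4
# All 2^N vectors with entries in {-1, +1}, one per row.
V = np.array(list(itertools.product([-1, 1], repeat=N)))  # shape (2**N, N)

mu = V.mean(axis=0)                   # first moment: the zero vector
cov = (V - mu).T @ (V - mu) / len(V)  # second moment about the mean

print(np.allclose(mu, np.zeros(N)))   # True
print(np.allclose(cov, np.eye(N)))    # True
```

The diagonal entries are $1$ because every element satisfies $v_{k_i}^2 = 1$, and the off-diagonal sums cancel because for $i \neq j$ exactly half of the vectors have $v_{k_i} v_{k_j} = 1$.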

Is there an algebraic proof for this?

Best answer:

One problem here is that you have not specified a distribution for the vectors: you have said what the possible values of the elements are, but not their probabilities. Without a distributional form for the vector, it is not correct to say that the first moment (mean) is zero, and it is not possible to derive the variance-covariance matrix. (Also, the mean of a random vector is itself a vector, not a scalar, so your notation is confused.)

Perhaps what you mean by the 'symmetry' here is that you intend every possible outcome to have equal probability (in which case, you should really specify this explicitly). In that case the elements of the vector are independent, each taking the values $-1$ and $1$ with probability $\tfrac{1}{2}$. This gives the distributional form $v_{k,i} \sim \text{IID } 2 \cdot \text{Bern}(\tfrac{1}{2}) - 1$, which gives you the moments:

$$\boldsymbol{\mu} \equiv \mathbb{E}(\boldsymbol{v}_k) = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \\ 0 \\ \end{bmatrix} = \boldsymbol{0} \quad \quad \quad \boldsymbol{\Sigma}_k \equiv \mathbb{V}(\boldsymbol{v}_k) = \mathbb{E}(\boldsymbol{v}_k \boldsymbol{v}_k^\text{T}) = \begin{bmatrix} 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & 1 & 0 \\ 0 & 0 & \cdots & 0 & 1 \\ \end{bmatrix} = \boldsymbol{I}.$$
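The covariance matrix can also be verified entrywise, which answers the request for an algebraic proof: since the components are independent with zero mean and satisfy $v_{k,i}^2 = 1$ identically,

$$\Sigma_{ij} = \mathbb{E}(v_{k,i} v_{k,j}) = \begin{cases} \mathbb{E}(v_{k,i}) \, \mathbb{E}(v_{k,j}) = 0, & i \neq j, \\ \mathbb{E}(v_{k,i}^2) = 1, & i = j, \end{cases}$$

which is exactly $\boldsymbol{I}$. The same computation goes through if you average over the $2^N$ equally weighted outcomes directly, since for $i \neq j$ exactly half of the vectors have $v_{k,i} v_{k,j} = 1$ and half have $v_{k,i} v_{k,j} = -1$.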