Suppose I have a discrete matrix-valued random variable $X$, that is, I have defined a set of fixed matrices $\{Y_i\}_{i=1}^n$, and the random variable $X = Y_i$ with probability $\frac{1}{n}$. Is there any coherent theory for investigating the expectation and variance of this r.v. $X$?
It seems that a reasonable(?) definition of $\mathbb{E}[X]$ is
\begin{align*} \mathbb{E}[X] = \frac{1}{n} \sum_{i}^n Y_i \end{align*}
which produces a matrix as the expectation. But I have no idea how the variance should be interpreted. Should the usual definition be used?
\begin{align*} Var(X) = \mathbb{E}[(X-\mathbb{E}[X])^2] = \frac{1}{n}\sum_{i=1}^n (Y_i - \mathbb{E}[X])^2 \end{align*}
What does the square even mean in this case?
Searching produces a lot of literature on the statistics of random matrices whose individual entries are random variables, but I was not able to find anything on the situation outlined above. Any pointers will be greatly appreciated!
The variance is defined in terms of the transpose, i.e. if $X$ is a matrix-valued random variable with real entries, then its variance is given by $$ \mathrm{Var} (X) = \mathbb{E} \left[ (X - \mathbb{E}[X])(X - \mathbb{E}[X])^\top\right].$$ In your case this results in $$\mathrm{Var} (X) = \frac1n \sum_{k=1}^n \left(Y_k- \mathbb{E}[X] \right)\left(Y_k - \mathbb{E}[X] \right)^\top.$$ So the "square" is interpreted as $MM^\top$, which makes $\mathrm{Var}(X)$ a symmetric positive semidefinite matrix, just as the scalar variance is a nonnegative number. Hope this helps you.
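As a sanity check, both formulas are easy to compute directly with NumPy. The sketch below assumes a small illustrative example: $n = 5$ arbitrary fixed $2 \times 2$ matrices $Y_i$, each taken with probability $1/n$ (the matrices here are just random draws for demonstration):

```python
import numpy as np

# Hypothetical example: n = 5 fixed 2x2 matrices Y_i, stacked as shape (n, 2, 2).
rng = np.random.default_rng(0)
Y = rng.standard_normal((5, 2, 2))

# E[X] = (1/n) * sum_i Y_i  -- an elementwise (matrix-valued) average.
E_X = Y.mean(axis=0)

# Var(X) = (1/n) * sum_k (Y_k - E[X]) (Y_k - E[X])^T,
# i.e. average the outer products D_k D_k^T over k.
D = Y - E_X
Var_X = np.einsum('kij,klj->il', D, D) / Y.shape[0]

# The result is a symmetric positive semidefinite matrix.
print(np.allclose(Var_X, Var_X.T))          # symmetry
print(np.all(np.linalg.eigvalsh(Var_X) >= -1e-12))  # PSD (up to round-off)
```

Note that for non-square $m \times p$ matrices this convention still works: $(Y_k - \mathbb{E}[X])(Y_k - \mathbb{E}[X])^\top$ is always $m \times m$.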