I have seen an expression not just for the covariance of two random variables, but for the whole covariance matrix $\mathbf{K}$: $$\mathbf{K} = E\bigg((\mathbf{x} - E(\mathbf{x}))(\mathbf{x} - E(\mathbf{x}))^{T}\bigg) $$ where $\mathbf{x}$ is a vector of random variables $(\mathbf{x_{1}}, \dots, \mathbf{x_{n}})$.
Presumably, the same holds in a more statistical sense for sample averages; let's denote them by $\langle \square \rangle = \frac{1}{n} \sum_{i=1}^{n}\big(\square\big)$. $$\mathbf{K} = \bigg\langle(\mathbf{x} - \langle\mathbf{x}\rangle)(\mathbf{x} - \langle\mathbf{x}\rangle)^{T}\bigg\rangle $$ where $\mathbf{x}$ now runs over a collection of sample vectors $(\mathbf{x_{1}}, \dots, \mathbf{x_{n}})$, e.g. $\mathbf{x_{1}} = \begin{bmatrix}1 \\ 2 \\ 3\end{bmatrix}$.
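As a quick numerical check of the sample-average version, here is a short NumPy sketch (the data is made up; `X` holds the sample vectors as columns). It evaluates the single matrix expression $\langle(\mathbf{x} - \langle\mathbf{x}\rangle)(\mathbf{x} - \langle\mathbf{x}\rangle)^{T}\rangle$ and compares it with NumPy's own covariance (with the $1/n$ normalization, `bias=True`, to match the average defined above):

```python
import numpy as np

# Made-up data: n = 5 observations of a 3-dimensional random vector,
# stored as the columns of X (shape 3 x 5).
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 5))

# Sample mean <x> as a column vector, so it broadcasts over columns.
mu = X.mean(axis=1, keepdims=True)

# Single matrix expression: K = <(x - <x>)(x - <x>)^T>
D = X - mu
K = (D @ D.T) / X.shape[1]

# Agrees with NumPy's covariance under the 1/n convention.
print(np.allclose(K, np.cov(X, bias=True)))
```

Note that with the $1/n$ average this is the biased estimator; `np.cov`'s default uses $1/(n-1)$.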
Can this be expanded to see that the individual elements of $\mathbf{K}$ are indeed the pairwise covariances?
Instead of computing each covariance individually to build $\mathbf{K}$, it would be nice to have a single analytical expression.
The covariance between, say, $x_j$ and $x_k$ is given by \begin{align} \sigma_{jk} &= \text{Cov}(x_j, x_k) \\ &= \mathbb{E}[(x_j-\mu_j)(x_k-\mu_k)]\\ &= \mathbb{E}(x_j x_k)-\mu_j \mu_k \\ \end{align} Clearly, when $k=j$ we obtain the variance: $$\sigma_{jj} = \mathbb{E}[(x_j-\mu_j)^2]$$ For $n$ variables, define the covariance matrix $$ \boldsymbol{\Sigma} = (\sigma_{ij})=\begin{bmatrix} \sigma_{11} & \sigma_{12} & \sigma_{13} & \dots & \sigma_{1n} \\ \sigma_{21} & \sigma_{22} & \sigma_{23} & \dots & \sigma_{2n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \sigma_{n1} & \sigma_{n2} & \sigma_{n3} & \dots & \sigma_{nn} \end{bmatrix} $$ This is exactly $\mathbb{E}\big[(\mathbf{x}-\boldsymbol{\mu})(\mathbf{x}-\boldsymbol{\mu})^T\big]$: the outer product $(\mathbf{x}-\boldsymbol{\mu})(\mathbf{x}-\boldsymbol{\mu})^T$ has $(i,j)$ entry $(x_i-\mu_i)(x_j-\mu_j)$, so taking the expectation entry-wise yields $\sigma_{ij}$. The diagonal entries ($i=j$) are the variances, and you can check that the off-diagonal entries follow the same pattern.
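The entry-wise identity $\sigma_{jk} = \mathbb{E}(x_j x_k)-\mu_j \mu_k$ can also be verified numerically against the single matrix expression. A small sketch with NumPy (the sample here is made up, with observations as rows of `X`):

```python
import numpy as np

rng = np.random.default_rng(1)
# Made-up sample: n = 1000 draws of a 2-dimensional vector (rows = observations).
X = rng.normal(size=(1000, 2))
mu = X.mean(axis=0)

# Full matrix at once: Sigma = E[(x - mu)(x - mu)^T], estimated with 1/n.
D = X - mu
Sigma = D.T @ D / X.shape[0]

# Entry-wise check: sigma_jk = E[x_j x_k] - mu_j * mu_k for every (j, k).
for j in range(2):
    for k in range(2):
        sigma_jk = np.mean(X[:, j] * X[:, k]) - mu[j] * mu[k]
        assert np.isclose(Sigma[j, k], sigma_jk)

# Diagonal entries are the variances sigma_jj = E[(x_j - mu_j)^2].
print(np.allclose(np.diag(Sigma), X.var(axis=0)))
```

The loop confirms that each element of the matrix built in one shot equals the corresponding covariance computed individually.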