Suppose we draw $X = [x_1,\ldots,x_i,\ldots,x_r,\ldots,x_t,\ldots,x_j,\ldots,x_N]$ from $N(0,\Sigma)$. Is there any way to compute $E[x_ix_j]$, and how about $E[x_ix_jx_rx_t]$?
How can I compute the marginal density $P(x_i,x_j)$?
I studied some information about this topic here: http://fourier.eng.hmc.edu/e161/lectures/gaussianprocess/node7.html
In the link, when a vector $X=[x_1,x_2] \sim N(\mu,\Sigma)$ is divided into two parts, we have $ \begin{bmatrix} \Sigma_{11}&\Sigma_{12}\\ \Sigma_{21}&\Sigma_{22}\\ \end{bmatrix} $ and $\begin{bmatrix} \mu_{1}\\ \mu_{2}\\ \end{bmatrix}$, and we can obtain $P(x_1)$ via $P(x_1)=\int P(x_1,x_2)\,dx_2$.
But I do not see how to proceed when $X$ has more than two elements: computing $N-2$ integrals to obtain $P(x_i,x_j)$, and then $E[x_ix_j]$ from it, is very tedious. Is there a simpler way to compute it?
Note that
$$ E[X_iX_j] = Cov[X_i, X_j] + E[X_i]E[X_j] = Cov[X_i, X_j] $$
as $E[X_i] = E[X_j] = 0$. So you just need to read off the $(i, j)$ entry of the covariance matrix as your desired answer. For the fourth moment, Isserlis' theorem gives, in the zero-mean case,
$$ E[X_iX_jX_rX_t] = \Sigma_{ij}\Sigma_{rt} + \Sigma_{ir}\Sigma_{jt} + \Sigma_{it}\Sigma_{jr}. $$
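As a quick numerical sanity check, here is a sketch using NumPy (the covariance matrix below is an arbitrary example I made up, not anything from the question): it compares the Monte Carlo estimates of $E[X_iX_j]$ and $E[X_iX_jX_rX_t]$ against $\Sigma_{ij}$ and the Isserlis formula for the zero-mean case.

```python
import numpy as np

# Example symmetric positive-definite covariance matrix for a
# zero-mean 4-dimensional Gaussian (any valid covariance would do).
Sigma = np.array([
    [2.0, 0.5, 0.3, 0.1],
    [0.5, 1.0, 0.2, 0.4],
    [0.3, 0.2, 1.5, 0.6],
    [0.1, 0.4, 0.6, 1.0],
])

rng = np.random.default_rng(0)
X = rng.multivariate_normal(mean=np.zeros(4), cov=Sigma, size=500_000)

i, j, r, t = 0, 1, 2, 3

# Second moment: E[X_i X_j] is just the (i, j) entry of Sigma.
second_mc = np.mean(X[:, i] * X[:, j])
print(second_mc, Sigma[i, j])

# Fourth moment via Isserlis' theorem (zero-mean case):
# E[X_i X_j X_r X_t] = S_ij S_rt + S_ir S_jt + S_it S_jr
fourth_theory = (Sigma[i, j] * Sigma[r, t]
                 + Sigma[i, r] * Sigma[j, t]
                 + Sigma[i, t] * Sigma[j, r])
fourth_mc = np.mean(X[:, i] * X[:, j] * X[:, r] * X[:, t])
print(fourth_mc, fourth_theory)
```

With 500,000 draws the Monte Carlo estimates should agree with the closed-form values to about two decimal places.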
Also, the random vector $(X_i, X_j)$ is an affine transformation of the multivariate normal random vector $(X_1, \ldots, X_n)$. Therefore it also follows a bivariate normal distribution, and thus you just need to extract the mean and covariance matrix of $(X_i, X_j)$ to completely characterize its distribution. Again this can be directly read from the given mean vector and covariance matrix of $(X_1, \ldots, X_n)$.
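Concretely, extracting the marginal of $(X_i, X_j)$ is just indexing; no integration is required. A minimal sketch (the mean vector and covariance matrix below are illustrative placeholders):

```python
import numpy as np

# Example mean vector and covariance matrix of the full N-dimensional
# Gaussian (N = 4 here, purely for illustration).
mu = np.array([1.0, -2.0, 0.5, 3.0])
Sigma = np.array([
    [2.0, 0.5, 0.3, 0.1],
    [0.5, 1.0, 0.2, 0.4],
    [0.3, 0.2, 1.5, 0.6],
    [0.1, 0.4, 0.6, 1.0],
])

i, j = 1, 3
idx = [i, j]

# Mean and 2x2 covariance of the bivariate marginal (X_i, X_j),
# read directly from mu and Sigma.
mu_marginal = mu[idx]
Sigma_marginal = Sigma[np.ix_(idx, idx)]

print(mu_marginal)      # [-2.  3.]
print(Sigma_marginal)   # [[1.  0.4]
                        #  [0.4 1. ]]
```

These two objects fully characterize $P(x_i, x_j)$, since the marginal is itself bivariate normal.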