Expected values of dependent Gaussian variables


Let $(X_1,X_2,X_3)$ be a set of three zero-mean Gaussian random variables with a covariance matrix of

$$ C=\sigma^2 \begin{bmatrix} 1 & \rho & \rho \\ \rho & 1 & \rho \\ \rho & \rho & 1 \end{bmatrix} $$ How can I find the following expected values

$E[X_1\mid X_2=x_2,X_3=x_3],\,E[X_1X_2\mid X_3=x_3]$ and $E[X_1X_2X_3]$?

This is a homework problem of mine, and I only have a clumsy way to deal with it.

Take the first question for example: I tried to find the conditional PDF $f_{X_1\mid X_2,X_3}(x_1\mid x_2,x_3)$ and then read off the answer from its mean, but that seems too complex...



BEST ANSWER

A nice feature of the (zero-mean) multivariate normal distribution is that it is compatible with the inner-product structure arising from its covariance:

\begin{equation} \begin{array}{ccc} \text{zero-mean Gaussian distribution} & & \text{inner product space} \\ \hline \text{covariance} & \leftrightarrow & \text{inner product} \\ \text{independence} & \leftrightarrow & \text{orthogonality} \\ \text{conditioning} & \leftrightarrow & \text{orthogonal projection} \end{array} \end{equation}

For example, it turns out that

$$ \mathbf{E}[X_1 \,|\, X_2, X_3] = \alpha_2 X_2 + \alpha_3 X_3 $$

for some constants $\alpha_2, \alpha_3$, as if the 'vector' $X_1$ were projected onto the linear span of $X_2$ and $X_3$; all you have to do is determine these constants so that $Y = \alpha_2 X_2 + \alpha_3 X_3$ solves

$$\operatorname{Cov}(X_1 - Y, X_2) = \operatorname{Cov}(X_1 - Y, X_3) = 0, \tag{*}$$

i.e., $X_1 - Y$ is 'orthogonal' to the span of $X_2$ and $X_3$. On the random-variable side, once $\alpha_2, \alpha_3$ satisfy this relation, $X_1 - Y$ is uncorrelated with $(X_2, X_3)$, which implies that they are independent thanks to joint normality. So we indeed have

$$\mathbf{E}[X_1 \,|\, X_2, X_3] = \mathbf{E}[(X_1 - Y) + Y \,|\, X_2, X_3] = \mathbf{E}[X_1 - Y] + Y = Y. $$

Going back to solving $\text{(*)}$: this can be done systematically, and it yields the formula in @Gabriel's link. In any case, the above equation reduces to

$$ \begin{bmatrix} \rho \\ \rho \end{bmatrix} = \begin{bmatrix} 1 & \rho \\ \rho & 1 \end{bmatrix} \begin{bmatrix} \alpha_2 \\ \alpha_3 \end{bmatrix} \qquad\Rightarrow\qquad \begin{bmatrix} \alpha_2 \\ \alpha_3 \end{bmatrix} = \frac{1}{1+\rho} \begin{bmatrix} \rho \\ \rho \end{bmatrix} $$

and so,

$$ \mathbf{E}[X_1 \,|\, X_2, X_3] = \frac{\rho}{1+\rho} X_2 + \frac{\rho}{1+\rho} X_3 $$
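As a sanity check (not part of the original argument), one can solve the $2\times 2$ system numerically and confirm by simulation that the least-squares regression of $X_1$ on $(X_2, X_3)$ recovers the same coefficients. The values $\sigma = 1$, $\rho = 0.5$ below are illustrative choices, not given in the problem:

```python
import numpy as np

sigma, rho = 1.0, 0.5  # illustrative values, not from the problem
C = sigma**2 * np.array([[1, rho, rho],
                         [rho, 1, rho],
                         [rho, rho, 1]])

# Solve the 2x2 normal equations from (*): C_{23,23} alpha = C_{23,1}
alpha = np.linalg.solve(C[1:, 1:], C[1:, 0])
print(alpha)  # both entries equal rho / (1 + rho) = 1/3

# Monte Carlo: the least-squares regression of X1 on (X2, X3)
# recovers the same coefficients.
rng = np.random.default_rng(0)
X = rng.multivariate_normal(np.zeros(3), C, size=200_000)
beta, *_ = np.linalg.lstsq(X[:, 1:], X[:, 0], rcond=None)
print(beta)  # approximately [1/3, 1/3]
```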

For the second part, the same idea shows that $X_i - \rho X_3$ and $X_3$ are independent for each $i = 1, 2$, and so,

\begin{align*} \mathbf{E}[X_1 X_2 \,|\, X_3] &= \mathbf{E}[((X_1 - \rho X_3) + \rho X_3)((X_2 - \rho X_3) + \rho X_3) \,|\, X_3] \\ &= \mathbf{E}[(X_1 - \rho X_3)(X_2 - \rho X_3)] + \rho^2 X_3^2 \\ &= \sigma^2 \rho(1 - \rho) + \rho^2 X_3^2 \end{align*}
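A quick Monte Carlo sketch of the two facts used here: the residual $X_i - \rho X_3$ is uncorrelated with $X_3$, and the residual product has mean $\sigma^2 \rho(1-\rho)$. Again, $\sigma = 1$, $\rho = 0.5$ are illustrative values:

```python
import numpy as np

sigma, rho = 1.0, 0.5  # illustrative values, not from the problem
C = sigma**2 * np.array([[1, rho, rho],
                         [rho, 1, rho],
                         [rho, rho, 1]])
rng = np.random.default_rng(1)
X1, X2, X3 = rng.multivariate_normal(np.zeros(3), C, size=500_000).T

# X_i - rho*X3 is uncorrelated with (hence independent of) X3 ...
print(np.cov(X1 - rho * X3, X3)[0, 1])  # approx 0

# ... and the residual product has mean sigma^2 rho (1 - rho) = 0.25.
print(np.mean((X1 - rho * X3) * (X2 - rho * X3)))  # approx 0.25
```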

For the final problem, we may utilize the above result to compute

$$ \mathbf{E}[X_1 X_2 X_3] = \mathbf{E}[\mathbf{E}[X_1 X_2 \,|\, X_3] X_3] = \mathbf{E}[(\sigma^2 \rho(1 - \rho) + \rho^2 X_3^2)X_3] = 0, $$

although the same conclusion follows from a more basic observation that the expectation of any odd-degree monomial in $X_i$'s is zero by symmetry.
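A one-line simulation of the last identity, with $\rho = 0.5$ as an illustrative value:

```python
import numpy as np

rho = 0.5  # illustrative value, not from the problem
C = np.array([[1, rho, rho], [rho, 1, rho], [rho, rho, 1]])
rng = np.random.default_rng(2)
X = rng.multivariate_normal(np.zeros(3), C, size=1_000_000)
print(X.prod(axis=1).mean())  # approx 0, matching E[X1 X2 X3] = 0
```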

ANOTHER ANSWER

Does this figure of the distribution help you?

For your first problem integrate along the red line.

[figure: plot of the joint distribution, with the conditioning line shown in red]