Conditional Gaussians, particular calculation


I'm looking for confirmation that my solution to this problem is correct. The result seems unintuitive.

Given zero-mean jointly Gaussian random variables $\{X_i\}_{i = 1}^{10}$ with $\mathbb{E}[X_iX_j] = 2^{-|i - j|}$, calculate $\mathbb{E}[X_5 \mid X_4, X_3]$.

Using the formula $\mathbb{E}[X \mid Y] = \mu_X + \Sigma_{XY}\Sigma^{-1}_Y(Y - \mu_Y)$ with $X = X_5$ and $Y = (X_4, X_3)^T$,

I calculated that

$$\Sigma_{XY} = [1/2 \quad 1/4]$$

$$\Sigma_Y^{-1} = \begin{bmatrix} 4/3 & -2/3 \\ -2/3 & 4/3 \end{bmatrix}$$

Turning the crank yields $\mathbb{E}[X_5 \mid X_4, X_3] = \frac{1}{2}X_4$.
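As a sanity check, I also computed the coefficient vector numerically (a minimal numpy sketch; the `cond_coef` helper and the 0-based indexing are my own, not part of the problem):

```python
import numpy as np

# Covariance matrix Sigma[i, j] = 2^{-|i - j|}, with 0-based indices,
# so X_k from the problem is index k - 1 here.
Sigma = np.array([[2.0 ** -abs(i - j) for j in range(10)] for i in range(10)])

def cond_coef(x_idx, y_idx):
    """Coefficients Sigma_XY @ inv(Sigma_Y) of E[X | Y] for zero-mean Gaussians."""
    Sigma_XY = Sigma[np.ix_(x_idx, y_idx)]
    Sigma_Y = Sigma[np.ix_(y_idx, y_idx)]
    return Sigma_XY @ np.linalg.inv(Sigma_Y)

print(cond_coef([4], [3, 2]))  # [[0.5 0. ]] -> E[X_5 | X_4, X_3] = X_4 / 2
```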

Similarly, I've calculated $\mathbb{E}[X_7 \mid X_{10}, X_9, X_8] = \frac{1}{2}X_8$ and $\mathbb{E}[(X_6, X_9)^T \mid X_7, X_8] = \frac{2}{3}(X_8, X_7)^T$.

These results don't look right to me, particularly the $\frac{2}{3}$ in the last one, but I can't find an error in my calculations. If they are correct, can someone explain the intuition?

Best answer

If $X=(X_6,X_9)^T$ and $Y=(X_7,X_8)^T$, then $\Sigma_{XY} = \begin{bmatrix} 1/2 & 1/4 \\ 1/4 & 1/2 \end{bmatrix}$ and $\Sigma_Y = \begin{bmatrix} 1 & 1/2 \\ 1/2 & 1 \end{bmatrix}$, so $\Sigma_{XY} \Sigma_Y^{-1} = \begin{bmatrix} 1/2 & 0 \\ 0 & 1/2 \end{bmatrix}$, i.e. $\mathbb{E}[(X_6,X_9)^T \mid X_7, X_8] = \frac{1}{2}(X_7, X_8)^T$. So you're right in thinking $2/3$ is wrong.
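(If you want to double-check that product numerically, here is a minimal numpy sketch; nothing goes in beyond the two matrices above:)

```python
import numpy as np

# Check Sigma_XY @ inv(Sigma_Y) for X = (X_6, X_9)^T, Y = (X_7, X_8)^T.
Sigma_XY = np.array([[0.5, 0.25],
                     [0.25, 0.5]])
Sigma_Y = np.array([[1.0, 0.5],
                    [0.5, 1.0]])
print(Sigma_XY @ np.linalg.inv(Sigma_Y))
# [[0.5 0. ]
#  [0.  0.5]]  -> E[(X_6, X_9)^T | X_7, X_8] = (X_7 / 2, X_8 / 2)^T
```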

As for the intuition: suppose $W_1 \sim N(0,1)$ and for each $n$ we have $W_{n+1}\mid W_n, \ldots, W_1 \sim N\left( \dfrac{W_n}{2}, \dfrac{3}{4} \right)$. This is a Markov chain. Then
$$\operatorname{E}(W_{n+1}) = \operatorname{E}(\operatorname{E}(W_{n+1} \mid W_n,\ldots,W_1)) = \operatorname{E}(W_n/2) = 0$$
and, by induction on $n$ (using $\operatorname{var} W_n = 1$),
\begin{align} \operatorname{var}W_{n+1} & = \operatorname{E}(\operatorname{var}(W_{n+1}\mid W_n,\ldots,W_1)) + \operatorname{var}(\operatorname{E}(W_{n+1}\mid W_n,\ldots,W_1)) \\[10pt] & = \operatorname{E}\left(\frac 3 4\right) + \operatorname{var}\left( \frac{W_n} 2 \right) = \frac 3 4 + \frac 1 4 = 1. \end{align}
And:
\begin{align} \operatorname{cov}(W_{n+1}, W_n) & = \operatorname{E}(W_{n+1}W_n) = \operatorname{E}(\operatorname{E}(W_{n+1}W_n\mid W_n, \ldots, W_1)) = \operatorname{E}\left( \frac {W_n^2}2 \right) = \frac 1 2. \end{align}
The same argument gives the induction step: since $W_n$ is measurable with respect to $W_{n+k}, \ldots, W_1$,
$$\operatorname{E}(W_{n+k+1}W_n) = \operatorname{E}\big(\operatorname{E}(W_{n+k+1} \mid W_{n+k}, \ldots, W_1)\,W_n\big) = \frac{1}{2}\operatorname{E}(W_{n+k}W_n),$$
so $\operatorname{cov}(W_{n+k},W_n) = 2^{-k}$.
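A quick Monte Carlo sketch of that chain makes the covariances visible (the seed, path count, and chain length are arbitrary choices of mine, not from the problem):

```python
import numpy as np

# Simulate W_1 ~ N(0, 1), W_{n+1} | W_n ~ N(W_n / 2, 3/4)
# and check that cov(W_{n+k}, W_n) is close to 2^{-k}.
rng = np.random.default_rng(0)
n_paths, length = 200_000, 10
W = np.empty((n_paths, length))
W[:, 0] = rng.standard_normal(n_paths)
for n in range(length - 1):
    W[:, n + 1] = 0.5 * W[:, n] + np.sqrt(0.75) * rng.standard_normal(n_paths)

for k in range(4):
    # The variables are zero-mean, so cov(W_{3+k}, W_3) = E[W_{3+k} W_3].
    print(k, np.mean(W[:, 2 + k] * W[:, 2]), 2.0 ** -k)
```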

So the random process you started with is actually this Markov chain: $(W_n)$ is jointly Gaussian with mean zero and $\operatorname{cov}(W_i, W_j) = 2^{-|i-j|}$, hence has the same distribution as $(X_n)$, although that might not be obvious from the way it was defined. In particular, given $X_4$, the variable $X_3$ carries no further information about $X_5$, which is why its coefficient vanished in your first calculation.