On a property of conditional expectation when the random variable is multivariate Normal.


The problem:

I am self studying Time Series: Theory and Methods by Brockwell and Davis.

Brockwell and Davis introduce conditional expectation in a geometrical way:

If $M$ is a closed subspace of $L^2$ containing the constant function, and if $X \in L^2$, the conditional expectation of $X$ given $M$ is the projection $P_MX$.

Here $P_MX$ is the unique element such that $P_MX \in M$ and $$ \|X - P_MX\|_{L^2} = \inf_{Y \in M} \|X - Y\|_{L^2}. $$

The existence and uniqueness of $P_MX$ are guaranteed by the Projection Theorem for Hilbert spaces.

Notationally we write $E_MX = P_MX$. We also define $M(Z_1, \dots, Z_n)$ to be the closed subspace of $L^2$ consisting of all random variables in $L^2$ of the form $\phi(Z_1, \dots, Z_n)$ for some Borel function $\phi: \mathbb{R}^n \rightarrow \mathbb{R}$.

The book then leaves it as an exercise (Problem 2.20) to prove that

$$P_{\bar{sp} \{1, Z_1, \dots, Z_n \}}(X) = E_{M(Z_1, \dots, Z_n)}[X]$$

if $(X,Z_1, \dots, Z_n)$ has a multivariate Normal distribution.
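Before attempting a proof, the claimed equality can be sanity-checked numerically: for jointly Gaussian variables, the conditional mean has the closed form $E[X \mid Z] = \mu_X + \Sigma_{XZ}\Sigma_{ZZ}^{-1}(Z - \mu_Z)$, which is affine in $Z$, so the best linear predictor (the projection onto $\bar{sp}\{1, Z_1, \dots, Z_n\}$, computable by least squares) should recover the same coefficients. A minimal sketch, where the covariance matrix and means are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Joint covariance and mean of (X, Z1, Z2); arbitrary but positive definite.
Sigma = np.array([[2.0, 0.8, 0.5],
                  [0.8, 1.0, 0.3],
                  [0.5, 0.3, 1.5]])
mu = np.array([1.0, 0.0, -1.0])

# Closed-form Gaussian conditional mean: E[X | Z] = a0 + a . Z.
Sxz = Sigma[0, 1:]             # Cov(X, Z)
Szz = Sigma[1:, 1:]            # Cov(Z, Z)
a = np.linalg.solve(Szz, Sxz)  # slope coefficients Sigma_ZZ^{-1} Sigma_ZX
a0 = mu[0] - a @ mu[1:]        # intercept

# Best linear predictor from a large sample: least squares onto span{1, Z1, Z2}.
samples = rng.multivariate_normal(mu, Sigma, size=200_000)
X, Z = samples[:, 0], samples[:, 1:]
design = np.column_stack([np.ones(len(Z)), Z])
coef, *_ = np.linalg.lstsq(design, X, rcond=None)

print(a0, a)   # population coefficients of E[X | Z]
print(coef)    # sample projection coefficients, close to the line above
```

The two printed vectors agreeing (up to sampling error) is exactly the content of the exercise in this example: the projection onto the linear span already achieves the conditional mean.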

My attempt:

Because $P_{\bar{sp} \{1, Z_1, \dots, Z_n \}}(X) \in \bar{sp} \{1, Z_1, \dots, Z_n \} $ we can write

$$P_{\bar{sp} \{1, Z_1, \dots, Z_n \}}(X) = \sum_{j =0}^n \alpha_j Z_j, \quad Z_0 =1 $$

We also know (from the prediction equations) that $\alpha_0, \dots, \alpha_n$ must satisfy

$$\left< \sum_{j =0}^n \alpha_j Z_j, Z_i \right> = \left< X, Z_i \right> \quad i= 0, 1, \dots, n$$

From these equations I can express $\alpha_1, \dots, \alpha_n$ in terms of the entries of the covariance matrix of the multivariate normal distribution, but I do not know how to proceed from here to prove the equality.
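To make the prediction equations concrete, here is a numerical sketch (covariance and means are arbitrary illustrative choices) that solves the sample Gram system $\langle \sum_j \alpha_j Z_j, Z_i\rangle = \langle X, Z_i\rangle$ and then checks that the residual $X - P_{\bar{sp}\{1, Z_1, \dots, Z_n\}}X$ has zero mean and is uncorrelated with each $Z_i$. In the Gaussian setting, uncorrelatedness of the (jointly Gaussian) residual and $Z$ implies independence, which suggests the route from orthogonality to the linear span up to orthogonality to all of $M(Z_1, \dots, Z_n)$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary positive-definite covariance for (X, Z1, Z2), for illustration only.
Sigma = np.array([[2.0, 0.8, 0.5],
                  [0.8, 1.0, 0.3],
                  [0.5, 0.3, 1.5]])
mu = np.array([1.0, 0.0, -1.0])
samples = rng.multivariate_normal(mu, Sigma, size=200_000)
X, Z = samples[:, 0], samples[:, 1:]

# Prediction equations: <sum_j a_j Z_j, Z_i> = <X, Z_i>, i = 0..n, with Z_0 = 1.
# In sample form this is the Gram system G a = b with G_ij = mean(Z_i Z_j).
design = np.column_stack([np.ones(len(Z)), Z])  # columns are Z_0 = 1, Z_1, Z_2
G = design.T @ design / len(X)
b = design.T @ X / len(X)
a = np.linalg.solve(G, b)

# The residual X - PX is orthogonal to span{1, Z_1, Z_2}: mean ~ 0 and sample
# covariance with each Z_i ~ 0 (exact up to floating-point error, since a solves
# the normal equations).
resid = X - design @ a
print(resid.mean())
print(np.cov(resid, Z[:, 0])[0, 1])
print(np.cov(resid, Z[:, 1])[0, 1])
```

The orthogonality here holds for any $L^2$ variables; what is special to the Gaussian case is that zero correlation between the residual and $(Z_1, \dots, Z_n)$ then forces independence.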