$X_1,\dots,X_n,X_{n+1}$ are jointly distributed with a Gaussian distribution. We let $X^* = E[X_{n+1}\mid X_1,\dots,X_n]$. Show that there exist constants $a_0,a_1,\dots,a_n$ such that $X^* = a_0+\sum_{k=1}^n a_kX_k$.
Additional info:
$X_{n+1} - X^*$ is independent of $X_1,\dots,X_n$, and $X^*$ satisfies the system of equations:
$E[X_{n+1}-X^*] = 0$
$E[(X_{n+1}-X^*)X_k] = 0, k = 1,...,n$
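Working backwards as suggested, if one posits $X^* = a_0 + \sum_{j=1}^n a_jX_j$, the two conditions become a linear system (the notation $\mu_k = E[X_k]$, $\Sigma_{kj} = \operatorname{Cov}(X_k,X_j)$, $c_k = \operatorname{Cov}(X_{n+1},X_k)$ is introduced here for illustration):

$$a_0 = \mu_{n+1} - \sum_{j=1}^n a_j\mu_j, \qquad \sum_{j=1}^n \Sigma_{kj}\,a_j = c_k, \quad k = 1,\dots,n.$$

This system is always consistent: any vector orthogonal to the range of $(\Sigma_{kj})$ corresponds to an a.s. constant linear combination of the $X_k$, and such a combination has zero covariance with $X_{n+1}$, so it is also orthogonal to $(c_k)$.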
There is a suggestion to work backwards: argue that there exist $a_k$ solving the last two equations; show that $X_{n+1} - X^*$ is independent of $\mathcal G = \sigma(X_1, \dots, X_n)$; and consider
$\int_G X_{n+1}\, dP = \int_G(X^* + X_{n+1} - X^*)\, dP$
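Spelling out the suggested computation: for any $G \in \sigma(X_1,\dots,X_n)$, independence of $X_{n+1}-X^*$ from the indicator $\mathbf 1_G$, together with the zero-mean condition, gives

$$\int_G(X^* + X_{n+1} - X^*)\,dP = \int_G X^*\,dP + E\big[(X_{n+1}-X^*)\mathbf 1_G\big] = \int_G X^*\,dP + E[X_{n+1}-X^*]\,P(G) = \int_G X^*\,dP,$$

so $X^*$, being a measurable function of $X_1,\dots,X_n$, satisfies the defining property of $E[X_{n+1}\mid X_1,\dots,X_n]$.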
Attempt:
The conditional expectation $E[X_{n+1}\mid X_1,\dots,X_n]$ is almost surely uniquely defined as the Borel function of $X_1,\dots,X_n$ for which $E[(X_{n+1} - E[X_{n+1}\mid X_1,\dots,X_n])\,g(X_1,\dots,X_n)] = 0$ for all bounded Borel functions $g$. In the jointly Gaussian case, it therefore suffices to find an affine combination $a_0 + \sum_{k=1}^n a_kX_k$ such that $X_{n+1} - (a_0 + \sum_{k=1}^n a_kX_k)$ has zero mean and is uncorrelated with each of $X_1,\dots,X_n$. Indeed, since $(X_{n+1} - (a_0 + \sum_{k=1}^n a_kX_k), X_1, \dots, X_n)$ is an affine transformation of $(X_1,\dots,X_{n+1})$, these variables are jointly Gaussian, so this uncorrelatedness would imply $X_{n+1} - (a_0 + \sum_{k=1}^n a_kX_k) \mathrel{\perp\!\!\!\perp} (X_1,\dots,X_n)$, which implies that for all bounded Borel functions $g$:
$E[(X_{n+1} - (a_0 + \sum_{k=1}^n a_kX_k))\,g(X_1,\dots,X_n)] = E[X_{n+1} - (a_0 + \sum_{k=1}^n a_kX_k)]\,E[g(X_1,\dots,X_n)] = 0,$
where the second equality uses $E[X_{n+1} - (a_0+\sum_{k=1}^na_kX_k)] = 0$. This then implies $E[X_{n+1}\mid X_1,\dots,X_n] = a_0+\sum_{k=1}^na_kX_k$ by the defining property of the conditional expectation. Writing out the equations corresponding to this uncorrelatedness, together with the equation $E[a_0+\sum_{k=1}^na_kX_k] = E[X_{n+1}]$, gives a collection of simultaneous linear equations that can be solved for the coefficients $a_0,a_1,\dots,a_n$.
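As a numerical sanity check (a minimal sketch with a hypothetical covariance matrix and mean vector, not part of the problem), one can solve this linear system with NumPy and verify the two defining equations by simulation:

```python
import numpy as np

# Hypothetical example: (X_1, X_2, X_3) jointly Gaussian, n = 2,
# so we compute E[X_3 | X_1, X_2] = a_0 + a_1 X_1 + a_2 X_2.
mu = np.array([1.0, -0.5, 2.0])        # means of (X_1, X_2, X_3)
Sigma = np.array([[2.0, 0.5, 0.3],     # covariance matrix (positive definite)
                  [0.5, 1.0, 0.4],
                  [0.3, 0.4, 1.5]])
n = 2

# Uncorrelatedness conditions: Sigma[:n, :n] a = Cov(X_k, X_{n+1});
# the zero-mean condition then fixes a_0.
a = np.linalg.solve(Sigma[:n, :n], Sigma[:n, n])
a0 = mu[n] - a @ mu[:n]

# Monte Carlo check of E[X_3 - X*] = 0 and E[(X_3 - X*) X_k] = 0.
rng = np.random.default_rng(0)
X = rng.multivariate_normal(mu, Sigma, size=200_000)
resid = X[:, n] - (a0 + X[:, :n] @ a)
print(abs(resid.mean()))                               # close to 0
print(np.abs((resid[:, None] * X[:, :n]).mean(axis=0)))  # both close to 0
```

Both printed quantities should be small up to Monte Carlo error, confirming that the solved coefficients satisfy the system.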