Conditional expectation of the sum of random variables given their difference.


I am trying to solve the following problem. Suppose that $X$ and $Y$ are independent normal random variables with $E(X)=E(Y)=0$, $E(X^2)=9$ and $E(Y^2)=16$. Define $S=X+Y$ and $D=X-Y$. Compute $E(S|D)$ and $\operatorname{Var}(S|D)$. The problem appears in one of Jim Pitman's final exams for his undergraduate probability course at UC Berkeley.

Here are a few thoughts. The random variables $S$ and $D$ are jointly Gaussian but not independent. Their covariance can be computed as follows. Since $E(S)=E(D)=0$, we have

$$\operatorname{Cov}(S,D)=E(SD)-E(S)E(D)=E(X^2-Y^2)-0=E(X^2)-E(Y^2)=9-16=-7.$$

We can build the covariance matrix $\Sigma$ for $S$ and $D$ (which have zero mean),

$$ \Sigma=\left[ \begin{array}{cc} E(S^2) & E(SD) \\ E(SD) & E(D^2) \end{array} \right], $$
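Plugging in the numbers: by independence, $\operatorname{Var}(S)=\operatorname{Var}(D)=\operatorname{Var}(X)+\operatorname{Var}(Y)=9+16=25$, and with $\operatorname{Cov}(S,D)=-7$ from above,

$$ \Sigma=\left[ \begin{array}{cc} 25 & -7 \\ -7 & 25 \end{array} \right]. $$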

and compute the joint density from it. At that point we can form the conditional density $$ f_{S|D}(s|d) = \frac{f_{S,D}(s,d)}{f_D(d)}, $$ where $f_D(d)$ is of course the Normal$(0,25)$ density, and from there read off the conditional expectation and conditional variance.
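As a quick numerical sanity check (a minimal sketch assuming NumPy is available, not part of the intended pen-and-paper solution), one can verify that for jointly Gaussian $(S,D)$ the conditional mean is linear in $D$ with slope $\operatorname{Cov}(S,D)/\operatorname{Var}(D)$, and that the conditional variance equals $\operatorname{Var}(S)-\operatorname{Cov}(S,D)^2/\operatorname{Var}(D)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
X = rng.normal(0.0, 3.0, n)   # Var(X) = 9
Y = rng.normal(0.0, 4.0, n)   # Var(Y) = 16
S, D = X + Y, X - Y

# Regression slope Cov(S, D) / Var(D); analytically -7/25 = -0.28
slope = np.cov(S, D)[0, 1] / np.var(D)

# Conditional variance = variance of the residual S - slope * D;
# analytically 25 - 49/25 = 576/25 = 23.04
resid = S - slope * D
cond_var = np.var(resid)

print(slope)      # ≈ -0.28
print(cond_var)   # ≈ 23.04
```

The residual trick works here because for jointly Gaussian variables, $S - \frac{\operatorname{Cov}(S,D)}{\operatorname{Var}(D)}D$ is independent of $D$, so its (unconditional) variance equals the conditional variance.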

Do you think there is a shorter way to solve the problem?