$E[\theta|x_1,x_2]=E[\theta|x_1]+E[\theta|x_2]$


How does one obtain the relation $$E[\theta|x_1,x_2]=E[\theta|x_1]+E[\theta|x_2]$$ when $x_1$ and $x_2$ are uncorrelated, jointly Gaussian vectors and $\theta$ has zero mean?

This relation is from "Fundamentals of statistical signal processing: Estimation theory" by Kay, on page $432$.

He states the claim without proof, and I was wondering how one could derive it.


From equation $(11.17)$ of the book, we have

$$\hat{\theta}=E(\theta|x)=E(\theta)+C_{\theta x}C_{xx}^{-1}(x-E(x))$$

Also, since $x_1$ and $x_2$ are uncorrelated and jointly Gaussian, they are independent; hence $C_{xx}$ is block diagonal with blocks $C_{x_1x_1}$ and $C_{x_2x_2}$, and $C_{\theta x}=[C_{\theta x_1}\ \ C_{\theta x_2}]$. Therefore

\begin{align}\hat{\theta}&=E(\theta|x)=E(\theta)+C_{\theta x}C_{xx}^{-1}(x-E(x))\\ &= E(\theta)+C_{\theta x_1}C_{x_1x_1}^{-1}(x_1-E(x_1))+C_{\theta x_2}C_{x_2x_2}^{-1}(x_2-E(x_2))\\ &=E(\theta)+[E(\theta|x_1)-E(\theta)]+[E(\theta|x_2)-E(\theta)]\end{align}

Since $E(\theta)=0$, the result holds.

The derivation also shows what happens when $E(\theta) \neq 0$: in general, $E(\theta|x_1,x_2)=E(\theta|x_1)+E(\theta|x_2)-E(\theta)$.
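As a quick numerical sanity check (a sketch, not from the book), one can build a toy joint Gaussian in which $x_1$ and $x_2$ are uncorrelated scalars and $\theta = x_1 + x_2 + w$ is zero mean, then evaluate both sides of the identity with the linear-Gaussian conditional-mean formula $E(\theta|x)=C_{\theta x}C_{xx}^{-1}x$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (an assumption for illustration): x1, x2 independent
# zero-mean Gaussians, theta = x1 + x2 + w with w independent noise.
# Then cov(theta, xi) = var(xi) and C_xx is diagonal.
s1, s2 = 2.0, 3.0                    # variances of x1 and x2
C_xx = np.diag([s1, s2])             # x1, x2 uncorrelated -> block diagonal
C_tx = np.array([s1, s2])            # [cov(theta,x1), cov(theta,x2)]

for _ in range(5):
    x1 = rng.normal(0.0, np.sqrt(s1))
    x2 = rng.normal(0.0, np.sqrt(s2))
    # E(theta | x1, x2) via the joint formula
    joint = C_tx @ np.linalg.inv(C_xx) @ np.array([x1, x2])
    # E(theta | x1) + E(theta | x2) via the scalar formulas
    single = (C_tx[0] / s1) * x1 + (C_tx[1] / s2) * x2
    assert np.isclose(joint, single)

print("identity verified")
```

Here both sides reduce to $x_1 + x_2$, so the agreement is exact; for a nonzero prior mean the same code exhibits the extra $-E(\theta)$ term.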