Showing that a r.v. is uncorrelated with a function of two other, mutually independent r.v.s


Consider three complex random variables, $g$, $x$, and $w$. I know that $g$ and $x$ are independent, and that $x$ and $w$ are uncorrelated ($\mathbb E[xw^*] = 0$), but nothing about the relation between $g$ and $w$. Both $x$ and $w$ have zero mean.

I'm having trouble showing that

$\mathbb E[(g-\mathbb E[g])xw^{*}]=0$.

I usually like to use the law of total expectation to condition on one of the variables, but I can't seem to calculate the resulting conditional expectation. I'm sure I'm missing something obvious, but I can't see what.

One example of where I get stuck is

$\mathbb E[(g-\mathbb E[g])xw^{*}] = \mathbb E_g[ \mathbb E[(g-\mathbb E[g])xw^{*}\vert g]] = \mathbb E_g[ (g-\mathbb E[g])\mathbb E[xw^{*}\vert g]].$

I don't know how the conditioning on $g$ affects the expectation of the product $xw^*$. No matter how I try to solve it, I end up with the same problem. What am I missing?

Edit

Based on Michael's comment, I might have misunderstood something. In the book I'm reading, the following is said:

  • $\mathbb E [w]=\mathbb E [x]=0$

  • $x$ and $w$ are uncorrelated, not necessarily independent

  • $g$ and $x$ are independent

  • No assumption is made about the statistical relation between $g$ and $w$

A direct calculation shows that the second and third terms in $\mathbb E [g]x + (g-\mathbb E[g])x + w$ are mutually uncorrelated, and uncorrelated with $x$.
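For reference, here is a direct check of the parts of that claim that do follow from the listed assumptions (this is my own working, not the book's):

$$\mathbb E\big[(g-\mathbb E[g])x\,x^*\big] = \mathbb E\big[g-\mathbb E[g]\big]\,\mathbb E\big[|x|^2\big] = 0, \qquad \mathbb E[w x^*] = \overline{\mathbb E[x w^*]} = 0,$$

using the independence of $g$ and $x$ for the first equality. The one remaining cross term is exactly $\mathbb E[(g-\mathbb E[g])x w^{*}]$, which is the quantity I can't handle.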

I'm sorry if I posed the incorrect problem, there must be something I'm not getting.


1 answer below

Best answer

You were correct in your original question. It looks like the book is wrong. The book seems to be claiming that $(G-E[G])X$ and $W$ are uncorrelated. To prove that, since both have zero mean, it suffices to show: $$ E[(G-E[G])XW]=0 $$ which is what you were trying to show (I am assuming the random variables are real for simplicity). This reduces to showing $E[GXW]=0$, which is not true in general.
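Spelling out that reduction (real case, zero-mean $X$ and $W$):

$$E\big[(G-E[G])XW\big] = E[GXW] - E[G]\,E[XW] = E[GXW],$$

since $E[XW]=0$ by the uncorrelatedness assumption. So the book's claim holds if and only if $E[GXW]=0$, and nothing in the pairwise assumptions forces that.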

Here is a specific counter-example:

Define $X$ and $G$ as i.i.d. with $P[X=1]=P[X=-1]=1/2$. Define $W=XG$. Then $X, W, G$ are pairwise independent and identically distributed (for example, compute $P[W=1|X=1]$ and $P[W=1|X=-1]$). But $WXG=W^2 = 1$. So $E[WXG]=1$.
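The counterexample is small enough to verify by exhaustive enumeration; here is a quick sketch (the variable and function names are mine):

```python
from itertools import product

# Exact check of the counterexample: X and G are i.i.d. uniform on
# {-1, +1}, and W = X * G.  Each of the four (x, g) outcomes has
# probability 1/4, so every expectation can be enumerated exactly.
outcomes = [(x, x * g, g) for x, g in product([-1, 1], repeat=2)]
p = 1 / len(outcomes)

def E(f):
    """Expectation of f(x, w, g) under the above distribution."""
    return sum(p * f(x, w, g) for x, w, g in outcomes)

# All three variables have zero mean, and all three pairs are uncorrelated:
assert E(lambda x, w, g: x) == 0
assert E(lambda x, w, g: w) == 0
assert E(lambda x, w, g: g) == 0
assert E(lambda x, w, g: x * w) == 0
assert E(lambda x, w, g: x * g) == 0
assert E(lambda x, w, g: w * g) == 0

# But the expectation the book claims is zero equals 1:
Eg = E(lambda x, w, g: g)  # = 0
print(E(lambda x, w, g: (g - Eg) * x * w))  # prints 1.0
```

This confirms that the pairwise assumptions alone are satisfied while $E[(G-E[G])XW]=1\neq 0$.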