Show diagonal covariance does not guarantee independence.


I was trying to show that if two variables have a diagonal covariance matrix, this does not necessarily guarantee their independence. As a counterexample, I take $X \sim U(-1,1)$ and $Y = X^{2}$.

Here is my try:

$$ p\left(y | x\right)=\delta\left(y-x^{2}\right) $$

Now we would like to show that although the two variables are dependent, the covariance matrix between them is diagonal. $$ \operatorname{cov}[X, Y]=\mathrm{E}[X Y]-\mathrm{E}[X] \mathrm{E}[Y] $$

$$ p\left(x, y\right)=p\left(x\right) p\left(y | x\right) $$

\begin{align*}
E[X Y] &= \iint x y \, p(x, y) \, dy \, dx \\
&= \iint x y \, p(x) \, p(y \mid x) \, dy \, dx \\
&= \int x \, p(x) \left( \int y \, \delta\left(y-x^{2}\right) dy \right) dx
\end{align*}

Here is the deal: I was trying to show that the off-diagonal elements are in fact zero, but I'm stuck at that integral. Any suggestion on how to proceed? Thanks
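The stalled inner integral can be checked symbolically. A minimal sketch using SymPy's `DiracDelta` (not part of the original question, just a verification of the derivation above):

```python
# Verify the delta-function step symbolically:
# inner integral sifts out y = x^2, and the outer integral over the
# U(-1,1) density (1/2 on [-1,1]) gives E[XY] = 0.
import sympy as sp

x, y = sp.symbols('x y', real=True)

# Sifting property: integral of y * delta(y - x^2) dy = x^2
inner = sp.integrate(y * sp.DiracDelta(y - x**2), (y, -sp.oo, sp.oo))

# E[XY] = (1/2) * integral of x * x^2 dx over [-1, 1] = 0 (odd integrand)
exy = sp.integrate(x * inner * sp.Rational(1, 2), (x, -1, 1))
print(inner, exy)
```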


Best answer:

By the sifting property of the delta function, $\int y\,\delta(y-x^2)\,dy = x^2$, so $E(XY)=E(X^3)=\frac{1}{2}\int_{-1}^1 x^3\,dx=0$. Since $E(X)=0$ as well, $\operatorname{cov}[X,Y]=E(XY)-E(X)E(Y)=0$.
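A quick Monte Carlo sketch confirms both halves of the claim: the sample covariance is near zero, yet $Y$ is fully determined by $X$, which shows up as a conditional mean $E[Y \mid |X| > 1/2] = 7/12$ that differs from the unconditional mean $E[Y] = 1/3$ (seed and sample size are arbitrary choices):

```python
# Monte Carlo check: X ~ U(-1, 1), Y = X^2 are perfectly dependent,
# yet their covariance is (approximately) zero.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 1_000_000)
y = x**2

cov_xy = np.cov(x, y)[0, 1]   # sample covariance, close to 0

# Dependence is visible in the conditional mean: E[Y] = 1/3, but
# E[Y | |X| > 1/2] = 7/12, so knowing X changes the distribution of Y.
print(cov_xy, y.mean(), y[np.abs(x) > 0.5].mean())
```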