Show that random variables $X$ and $Y$ are not independent, but nevertheless Cov$[X,Y] = 0$


Let $Z$ be a random variable uniformly distributed on $[0,1]$. Show that the random variables $X = \sin 2\pi Z$ and $Y = \cos 2\pi Z$ are not independent, but nevertheless Cov$[X,Y]=0$.

This is a homework assignment, but I'm a bit stuck.

My thoughts

We can see that $X$ and $Y$ are not independent, since both depend on $Z$. If we want to show this explicitly, then we need to show that $$f_{X,Y}(a,b) \neq f_X(a)\;f_Y(b),$$ where $f_{X,Y}(a,b)$ is the joint probability density function. But how can I find the (joint) probability density functions $f_X, f_Y$ and $f_{X,Y}$?

If I can find these functions, I can also solve the covariance problem. Is this the right way? Or is there a 'better' way to solve this problem?

There are 5 answers below.

Best answer

It is not necessary to find these functions.

To prove dependency it is enough to find sets $A,B$ such that $$P(X\in A\wedge Y\in B)\neq P(X\in A)P(Y\in B)$$

To prove that the covariance is $0$ it is enough to show that $$\mathbb EXY=\mathbb EX\mathbb EY$$

and for that you do not need the PDFs either.

E.g. note that: $$\mathbb EXY=\int_0^1\sin2\pi z\cos2\pi z~\mathrm dz=0,$$ since $\sin 2\pi z\cos 2\pi z=\tfrac12\sin 4\pi z$ integrates to zero over a full period.
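As a sanity check (not part of the proof), the claim $\mathbb E[XY]=\mathbb E[X]\,\mathbb E[Y]$ can be verified by Monte Carlo simulation: with $Z \sim \mathrm{Uniform}[0,1]$, the sample averages of $XY$, $X$, and $Y$ should all be near $0$.

```python
# Monte Carlo sketch: estimate Cov[X, Y] = E[XY] - E[X]E[Y] by simulation.
import numpy as np

rng = np.random.default_rng(0)
z = rng.uniform(0.0, 1.0, size=1_000_000)
x = np.sin(2 * np.pi * z)
y = np.cos(2 * np.pi * z)

# Sample covariance estimate; should be close to 0.
cov_estimate = np.mean(x * y) - np.mean(x) * np.mean(y)
print(cov_estimate)
```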

Another answer

There is an easier way. It is sufficient to show that $P(X\in A,Y\in B)\neq P(X\in A)P(Y\in B)$ for some sets $A,B$. For example, you can take $A=B=[0.9,1]$.
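The choice $A=B=[0.9,1]$ works because $X^2+Y^2=1$ while $0.9^2+0.9^2=1.62>1$, so the joint event is impossible even though each marginal event has positive probability. A quick numerical sketch:

```python
# Check that P(X in A, Y in B) = 0 while P(X in A) * P(Y in B) > 0
# for A = B = [0.9, 1], using Monte Carlo samples of Z ~ Uniform[0, 1].
import numpy as np

rng = np.random.default_rng(1)
z = rng.uniform(0.0, 1.0, size=1_000_000)
x = np.sin(2 * np.pi * z)
y = np.cos(2 * np.pi * z)

p_x = np.mean(x >= 0.9)                      # positive (about 0.14)
p_y = np.mean(y >= 0.9)                      # positive (about 0.14)
p_joint = np.mean((x >= 0.9) & (y >= 0.9))   # exactly 0: x^2 + y^2 = 1
print(p_x * p_y, p_joint)
```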

Another answer

A routine integration gives you $E(X)=E(Y)=E(XY)=0$, so that $\mathrm{Cov}(X,Y)=0$. That is, $X$ and $Y$ are uncorrelated.

But $X$ and $Y$ are not independent, since if the value of $X$ is known, then $Z$ is restricted to one of (at most) two possible values, which forces $Y$ to be one of two values as well. In other words, the conditional distribution of $Y\mid X$ is not the same as the distribution of $Y$.
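This can be made concrete: given $X = x$, solving $\sin 2\pi Z = x$ on $[0,1)$ gives two values of $Z$, which force $Y = \pm\sqrt{1-x^2}$. A small illustration (the helper function below is mine, not from the original answer):

```python
# Given sin(2*pi*Z) = x, Z can only be one of two points in [0, 1),
# so Y = cos(2*pi*Z) is forced to be +sqrt(1 - x^2) or -sqrt(1 - x^2),
# far from Y's unconditional distribution on all of [-1, 1].
import math

def y_values_given_x(x):
    """The only two values Y = cos(2*pi*Z) can take when sin(2*pi*Z) = x."""
    z1 = (math.asin(x) / (2 * math.pi)) % 1.0      # first solution for Z
    z2 = (math.pi - math.asin(x)) / (2 * math.pi)  # second solution for Z
    return (math.cos(2 * math.pi * z1), math.cos(2 * math.pi * z2))

print(y_values_given_x(0.5))  # (sqrt(3)/2, -sqrt(3)/2), roughly (0.866, -0.866)
```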

Another answer

Since the probability has been covered, I'm going to look at the covariance.

Observe that $$\text{Cov}(X, Y) = \mathbb{E}[XY]-\mathbb{E}[X]\mathbb{E}[Y]$$ so that $$\text{Cov}(X, Y) = \mathbb{E}[\sin(2\pi Z)\cos(2\pi Z)] - \mathbb{E}[\sin(2\pi Z)]\mathbb{E}[\cos(2\pi Z)]$$ Recall the trigonometric identity $$\sin(2\theta) = 2\sin(\theta)\cos(\theta)$$ Thus, $$\sin(4\pi Z) = 2\sin(2\pi Z)\cos(2\pi Z) \implies \dfrac{\sin(4\pi Z)}{2}=\sin(2\pi Z)\cos(2\pi Z)$$ hence the covariance is $$\begin{align}\text{Cov}(X, Y) &= \mathbb{E}\left[\dfrac{\sin(4\pi Z)}{2}\right] - \mathbb{E}[\sin(2\pi Z)]\mathbb{E}[\cos(2\pi Z)] \\ &= \dfrac{1}{2}\mathbb{E}\left[\sin(4\pi Z)\right]-\mathbb{E}[\sin(2\pi Z)]\mathbb{E}[\cos(2\pi Z)]\text{.} \end{align}$$ Since the density function $f_Z(z) = 1$ for $z \in [0, 1]$, we have $$\begin{align}\text{Cov}(X, Y) &= \dfrac{1}{2}\mathbb{E}\left[\sin(4\pi Z)\right]-\mathbb{E}[\sin(2\pi Z)]\mathbb{E}[\cos(2\pi Z)] \\ &= \dfrac{1}{2}\int_{0}^{1}\sin(4\pi z)\text{ d}z - \left[\int_{0}^{1}\sin(2\pi z)\text{ d}z \right]\left[\int_{0}^{1}\cos(2\pi z)\text{ d}z \right] \\ &= \dfrac{1}{2(4\pi)}\int_{0}^{4\pi}\sin(\theta)\text{ d}\theta - \dfrac{1}{(2\pi)^2}\left[\int_{0}^{2\pi}\sin(\theta)\text{ d}\theta \right]\left[\int_{0}^{2\pi}\cos(\theta)\text{ d}\theta \right] \tag{*}\\ &= \dfrac{-1}{8\pi}[\cos(4\pi)-\cos(0)]-\dfrac{-1}{4\pi^2}[\cos(2\pi)-\cos(0)][\sin(2\pi)-\sin(0)] \\ &= \dfrac{-1}{8\pi}(1-1)+\dfrac{1}{4\pi^2}(1-1)(0-0) \\ &= 0\text{.} \end{align}$$ In step $(*)$, I substituted $\theta = 4\pi z$ in the first integral and $\theta = 2\pi z$ in the other two.
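The whole computation above can be double-checked symbolically (a sketch, assuming SymPy is available); each expectation is an integral against the density $f_Z(z)=1$ on $[0,1]$.

```python
# Symbolic verification of Cov(X, Y) = E[XY] - E[X]E[Y] = 0 with SymPy.
import sympy as sp

z = sp.symbols('z')
e_xy = sp.integrate(sp.sin(2 * sp.pi * z) * sp.cos(2 * sp.pi * z), (z, 0, 1))
e_x = sp.integrate(sp.sin(2 * sp.pi * z), (z, 0, 1))
e_y = sp.integrate(sp.cos(2 * sp.pi * z), (z, 0, 1))

cov = sp.simplify(e_xy - e_x * e_y)
print(cov)  # 0
```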

Another answer

As I have just started reading probability theory, I would like to put this in a more measure-theoretic perspective. Feel free to point out any mistakes.

Let $(\Omega, \mathcal{F}, P)$ be our probability space, and $Z$ a nonnegative measurable function.

  1. Let $P_Z$ be the pushforward measure on $\mathbb{R}$ given by $P_Z(E) =P(Z^{-1}(E))$. Then, first for simple functions and then by the monotone convergence theorem, we have $$ \int_{\Omega} Z \, dP = \int_{\mathbb{R}} x \, dP_Z(x). $$

As a corollary, in our case, $Z$ is uniform, hence $P_Z[0,t] = t$, and $P_Z$ coincides with the Lebesgue measure $\mu$ restricted to $[0,1]$. The proof is standard, and uses the monotone class theorem / Dynkin's lemma.

We have Borel measurable functions $g_1, g_2: [0,1] \rightarrow \mathbb{R}$ given by $g_1(x) := \sin 2 \pi x$ and $g_2(x) := \cos 2\pi x$, so that $X=g_1(Z)$ and $Y=g_2(Z)$. What we need to show, as noted in the other posts, is $E(XY) =E(X)E(Y)$.

  2. If $h$ is an integrable function, then as in 1. we can prove $$ \int h(Z) \, dP = \int h \, dP_Z . $$

Let us compute $E(XY)$; the computations for $E(X)$ and $E(Y)$ are similar. $$ \int (g_1\cdot g_2)(Z) \, dP = \int_0^1 (g_1 \cdot g_2) \, dP_Z = \int_0^1 \sin 2 \pi z \cos 2 \pi z \, dz = 0 $$
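The change-of-variables identity in 2. can also be sanity-checked numerically (a sketch, not a proof): the sample mean of $h(Z)$ over uniform draws should agree with the Lebesgue integral of $h$ over $[0,1]$, here with $h = g_1 \cdot g_2$.

```python
# Compare E[h(Z)] (Monte Carlo over uniform draws of Z) with the
# integral of h against P_Z = Lebesgue measure on [0, 1] (midpoint rule),
# for h(x) = g1(x) * g2(x) = sin(2*pi*x) * cos(2*pi*x). Both are near 0.
import math
import random

def h(x):
    return math.sin(2 * math.pi * x) * math.cos(2 * math.pi * x)

random.seed(0)
n = 200_000
mc = sum(h(random.random()) for _ in range(n)) / n        # E[h(Z)]
riemann = sum(h((k + 0.5) / n) for k in range(n)) / n     # integral h dP_Z
print(mc, riemann)
```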