Expectation of a function of two independent random variables.


Suppose $X$ and $Y$ are independent and uniformly distributed on $[0, 1]$. Define $Z = (X-Y)^2$. I'm interested in the variance $\sigma_Z^2 = \mathbb{E}[Z^2] - \mathbb{E}[Z]^2$, and suppose I already know $\mathbb{E}[Z]^2 = \frac{1}{36}$.

For $\mathbb{E}[Z^4]$, is computing $\mathbb{E}[Z^2] \cdot \mathbb{E}[Z^2]$ sufficient, since $X$ and $Y$ are i.i.d.?

BEST ANSWER

In general the answer to "is $\mathbb{E}(Z^4)=\mathbb{E}(Z^2)^2$?" is no, so I would not try to use this. Just directly compute everything.
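A quick simulation makes the difference concrete (a sketch for illustration; the sample size and seed are arbitrary). For this $Z$, $\mathbb{E}(Z^4) = \frac{1}{45} \approx 0.022$ while $\mathbb{E}(Z^2)^2 = \frac{1}{225} \approx 0.0044$:

```python
import numpy as np

# Monte Carlo sanity check: with X, Y ~ U(0,1) independent and Z = (X-Y)^2,
# E(Z^4) and E(Z^2)^2 are not equal.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(size=n)
y = rng.uniform(size=n)
z = (x - y) ** 2

ez2 = np.mean(z ** 2)  # estimates E(Z^2) = 1/15 ~ 0.0667
ez4 = np.mean(z ** 4)  # estimates E(Z^4) = 1/45 ~ 0.0222
print(ez4, ez2 ** 2)   # E(Z^2)^2 = 1/225 ~ 0.0044: clearly different
```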

Assuming you want to compute the variance of $Z^2$:

Let $Z=(X-Y)^2$ where $X,Y$ are i.i.d. $\mathcal{U}(0,1)$ RVs. By the well-known formula for variance, the variance of $Z^2$ is $$\sigma_{Z^2}^2=\mathbb{E}(Z^4)-\mathbb{E}(Z^2)^2.$$ Both of these expectations can be computed directly.

Let's start with the lower power. Applying the binomial theorem, $$Z^2=(X-Y)^4=X^4-4X^3Y+6X^2Y^2-4XY^3+Y^4,$$ then taking expectations and using linearity and the independence of $X$ and $Y$, $$\mathbb{E}(Z^2)=\mathbb{E}(X^4)-4\mathbb{E}(X^3)\mathbb{E}(Y)+6\mathbb{E}(X^2)\mathbb{E}(Y^2)-4\mathbb{E}(X)\mathbb{E}(Y^3)+\mathbb{E}(Y^4).$$ Since $X$ and $Y$ are also identically distributed uniform RVs on $(0,1)$, they have the same moments, so this reduces to $$\mathbb{E}(Z^2)=2\mathbb{E}(U^4)-8\mathbb{E}(U^3)\mathbb{E}(U)+6\mathbb{E}(U^2)^2,$$ where $U$ is any $\mathcal{U}(0,1)$ RV. All of these are standard integrals, $\mathbb{E}(U^k)=\int_0^1 u^k\,du=\frac{1}{k+1}$; once that is over with, just remember to square the result when you substitute it into the variance formula. Repeat the process for $Z^4=(X-Y)^8$ and you are done after substituting both results back into the variance formula.
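For reference, the moments above can also be evaluated symbolically, e.g. with sympy (a sketch; the integrals over the unit square mirror the direct computation described above):

```python
from sympy import symbols, integrate, Rational

x, y = symbols('x y')

# E((X - Y)^k) for X, Y i.i.d. U(0,1): integrate over the unit square.
def moment(k):
    return integrate(integrate((x - y) ** k, (x, 0, 1)), (y, 0, 1))

ez2 = moment(4)          # E(Z^2) = E((X-Y)^4) = 1/15
ez4 = moment(8)          # E(Z^4) = E((X-Y)^8) = 1/45
var_z2 = ez4 - ez2 ** 2  # Var(Z^2) = 1/45 - 1/225 = 4/225
print(ez2, ez4, var_z2)
```

This gives $\sigma_{Z^2}^2 = \frac{1}{45} - \frac{1}{225} = \frac{4}{225}$.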