Independent variables' probability functions and expectations


So, I've got this exercise

Two independent random variables $X, Y \sim \text{Uniform}[0, 1]$. Find the probability function of the random variable $Z = X - Y$, and compute the expectation $E[Z]$.

So, I need to find the probability function of the difference of $X$ and $Y$, right?

$F_Z(c) = P(Z \leq c) = P(X - Y \leq c)$ for any $c \in \mathbb{R}$

Can someone compute it, please? I was able to work out the expectation, but I'm not sure about it. Can anyone check it?

$E(Z)=E(X−Y)=E(X)−E(Y)=0.5−0.5=0.$
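The value $E[Z] = 0$ can also be checked numerically. Below is a minimal Monte Carlo sketch (not part of the original exercise; it assumes Python's standard `random` module) that draws many independent pairs $(X, Y)$ and averages $X - Y$:

```python
import random

random.seed(0)
n = 1_000_000
# Draw n independent pairs (X, Y) ~ Uniform[0, 1] and average Z = X - Y.
mean_z = sum(random.random() - random.random() for _ in range(n)) / n
print(mean_z)  # should be close to 0
```

By the law of large numbers, the printed sample mean should be within a few thousandths of the true value $E[Z] = 0$.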

Thank you.

There are 2 answers below.

Answer 1:

This result does not depend on the particular values of $E(X)$ and $E(Y)$: since $X$ and $Y$ have the same distribution, $E(X) = E(Y)$, and so by linearity of expectation $E(Z) = E(X) - E(Y) = 0$.

Answer 2:

Since expectation (as an operator) is linear, the following statement is correct regardless of the independence between $X$ and $Y$: $$E(Z) = E(X − Y) = E(X) − E(Y) = 0.5 − 0.5 = 0.$$

However, to find the probability density function (pdf) of $Z = X - Y$, we can use a convolution. In general, when $X$ and $Y$ are independent random variables, the density of $Z = X - Y$ is given by the convolution formula $$f_Z(z) = \int_{-\infty}^\infty f_{X,Y}(x, x-z)\,dx = \int_{-\infty}^\infty f_X(x)f_Y(x-z)\,dx.$$

In this question, we have $X, Y \overset{\text{i.i.d.}}{\sim} \text{Uniform}[0,1]$, so the pdfs of $X$ and $Y$ are given by $$f_X(x) = \mathbb{1}_{0 \leq x \leq 1} \quad \text{ and } \quad f_Y(y) = \mathbb{1}_{0 \leq y \leq 1}.$$

Since $X$ and $Y$ are independent, the convolution formula gives $$f_Z(z) = \int_{-\infty}^\infty f_X(x)f_Y(x-z)\,dx = \int_0^1 f_Y(x-z)\,dx.$$

The integrand is nonzero only when $0 \leq x - z \leq 1$, that is, when $z \leq x \leq z+1$, which implies that $$f_Z(z) = \int_{\max\{0,z\}}^{\min\{1,z+1\}} 1\,dx.$$ By comparing the integral bounds, we end up with the following cases.

Case 1: $-1 \leq z \leq 0$. Then $\max\{0,z\} = 0$ and $\min\{1,z+1\} = z+1$, so that $$f_Z(z) = \int_0^{z+1} 1\,dx = z+1.$$

Case 2: $0 < z \leq 1$. Then $\max\{0,z\} = z$ and $\min\{1,z+1\} = 1$, so that $$f_Z(z) = \int_z^1 1\,dx = 1-z.$$

For $z < -1$ or $z > 1$ the density is zero. To sum up, the probability density function of $Z = X - Y$ is triangular, given by $$f_Z(z) = \begin{cases} z+1 & \text{if } -1 \leq z \leq 0;\\ 1-z & \text{if } 0 < z \leq 1; \\ 0 & \text{otherwise}. \end{cases}$$

Lastly, we can use the pdf of $Z$ to calculate its mean, that is, \begin{align*} E[Z] &= \int_{-\infty}^\infty zf_Z(z)dz\\ &= \int_{-1}^0 zf_Z(z)dz + \int_0^1 zf_Z(z)dz\\ &= \int_{-1}^0 z(z+1)dz + \int_0^1 z(1-z)dz\\ &= \int_{-1}^0 (z^2+z)dz + \int_0^1 (z-z^2)dz\\ &= \left[\frac{z^3}{3} + \frac{z^2}{2}\right]\bigg|_{-1}^0 + \left[\frac{z^2}{2} - \frac{z^3}{3}\right]\bigg|_0^1\\ &= 0 - \left(-\frac{1}{3} + \frac{1}{2}\right) + \left(\frac{1}{2} - \frac{1}{3}\right) - 0\\ &= \frac{1}{3} - \frac{1}{2} + \frac{1}{2} - \frac{1}{3}\\ &= 0, \end{align*} as desired.
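The triangular density can be verified numerically. Integrating $f_Z$ gives the CDF: $P(Z \leq c) = (c+1)^2/2$ for $-1 \leq c \leq 0$ and $P(Z \leq c) = 1 - (1-c)^2/2$ for $0 < c \leq 1$. The sketch below (an illustration using only Python's standard library, not part of the original answer) compares this analytic CDF with the empirical CDF of simulated samples of $Z = X - Y$:

```python
import random

random.seed(1)
n = 200_000
# Simulate Z = X - Y with X, Y ~ Uniform[0, 1] independent.
samples = [random.random() - random.random() for _ in range(n)]

def cdf_triangular(c):
    """CDF of Z implied by the triangular density, valid for -1 <= c <= 1."""
    if c <= 0:
        return (c + 1) ** 2 / 2      # area under (z + 1) from -1 to c
    return 1 - (1 - c) ** 2 / 2      # total area minus area under (1 - z) from c to 1

for c in (-0.5, 0.0, 0.5):
    empirical = sum(z <= c for z in samples) / n
    print(f"P(Z <= {c}): analytic {cdf_triangular(c):.4f}, simulated {empirical:.4f}")
```

For these three points the analytic values are $0.125$, $0.5$, and $0.875$, and the simulated proportions should agree to about two decimal places.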