How can I compute E[y]?


I have that $X$ is a RV with $P(X=1)=1-P(X=2)=p$ and $Y$ is a RV such that $(Y\mid X=k)\stackrel{d}{=}\mathcal U[0,k]$.

First, I try to find the joint density: $f(k,y)=f_X(k)\,f_{Y\mid X=k}(y)=\frac{1}{2}\cdot\frac{1}{k}=\frac{1}{2k};\quad k=1,2$

Then the density of $Y$ would be $f_Y(y)=\sum\limits_{k=1}^{2}\frac{1}{2k}=\frac{3}{4}$

$E[Y]=\int\limits_0^2 y\,f_Y(y)\,dy=\frac{3}{4}\int\limits_0^2 y\,dy=\frac{3}{2}$

I would like to know if this is the way to do it; I am kind of confused since one variable is discrete and the other is continuous.


There are 4 best solutions below


You have probability $p$ that $Y\sim U(0,1)$ and probability $1-p$ that $Y\sim U(0,2)$. So $E(Y)=p\cdot\frac{1}{2}+(1-p)\cdot 1=1-\frac{p}{2}$.
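A quick Monte Carlo sketch of this mixture (plain Python; the function name is mine), checking that the empirical mean lands near $1-\frac{p}{2}$:

```python
import random

def sample_Y(p, rng):
    # With probability p we see X = 1 and draw Y ~ U(0, 1);
    # otherwise X = 2 and we draw Y ~ U(0, 2).
    k = 1 if rng.random() < p else 2
    return rng.uniform(0, k)

p = 0.3
rng = random.Random(42)
n = 200_000
est = sum(sample_Y(p, rng) for _ in range(n)) / n
exact = p * 0.5 + (1 - p) * 1.0   # = 1 - p/2
```

With $n=200{,}000$ draws the standard error is on the order of $10^{-3}$, so the estimate should agree with the exact value to about two decimal places.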


If I understand your question correctly, you mean $Y\sim \text{Uniform}[0,k]$ conditional on $X=k$, so that $X-1\sim\text{Bernoulli}(1-p)$.

You could always work with the CDF and then differentiate it to get the density.

$$F_Y(y)=P(Y\leq y) =P(Y\leq y\mid X=1)\, P(X=1)+P(Y\leq y\mid X=2)\,P(X=2)$$$$=p\,y\, 1_{\{0\leq y< 1\}}+ (1-p)\frac{y}{2}\,1_{\{0\leq y< 2\}}+p\,1_{\{y\geq 1\}}+(1-p)\,1_{\{y\geq2\}} $$

The density of $Y$ is then $$f_Y(y)=p1_{\{0\leq y\leq 1\}}+\frac{(1-p)}{2}1_{\{0\leq y\leq 2\}}$$

Then $$EY=\frac{p}{2}+(1-p)=1-\frac{p}{2}$$

Another really quick way is to use conditional expectation.

$$EY=E(E(Y\mid X))=p\,E(Y\mid X=1)+(1-p)\,E(Y\mid X=2)=\frac{p}{2}+1-p=1-\frac{p}{2}$$
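Both routes above can be checked symbolically; a small sketch (assuming SymPy is available) that integrates $y\,f_Y(y)$ over the two pieces of the density:

```python
import sympy as sp

y, p = sp.symbols('y p', positive=True)
# Density of Y from the mixture: p + (1-p)/2 on [0, 1], and (1-p)/2 on (1, 2].
f_low = p + (1 - p) / 2
f_high = (1 - p) / 2
EY = sp.integrate(y * f_low, (y, 0, 1)) + sp.integrate(y * f_high, (y, 1, 2))
# EY should simplify to 1 - p/2.
```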


Since one is discrete and the other is continuous, there is no joint density. But there is a joint CDF, as there always is. It is actually fairly simple to define:

$$F(x,y) = \begin{cases} 0 & x < 1 \\ p \int_{-\infty}^y f(z) dz & 1 \leq x < 2 \\ p \int_{-\infty}^y f(z) dz + (1-p) \int_{-\infty}^y g(z) dz & x \geq 2 \end{cases}$$

where $f$ is the pdf of a $U(0,1)$ random variable and $g$ is the pdf of a $U(0,2)$ random variable.

You can use this with Stieltjes integration to compute the expected value.
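One way to sanity-check this CDF without setting up the Stieltjes integral by hand: the marginal is $F_Y(y)=F(x,y)$ for any $x\geq 2$, and for a nonnegative random variable $E[Y]=\int_0^\infty\big(1-F_Y(y)\big)\,dy$. (The tail formula is my addition, not part of this answer.) A SymPy sketch:

```python
import sympy as sp

y, p = sp.symbols('y p', positive=True)
# Marginal CDF of Y, read off from F(x, y) at x >= 2:
F_01 = p * y + (1 - p) * y / 2   # valid for 0 <= y <= 1
F_12 = p + (1 - p) * y / 2       # valid for 1 <= y <= 2
# Tail formula for a nonnegative RV: E[Y] = integral of (1 - F_Y).
EY = sp.integrate(1 - F_01, (y, 0, 1)) + sp.integrate(1 - F_12, (y, 1, 2))
# EY should simplify to 1 - p/2.
```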

An easier approach is to define: $B\sim$ Bernoulli$(p)$, $U_1\sim U(0,1)$, $U_2\sim U(0,2)$, all three independent, and $Y=B U_1 + (1-B) U_2$. Then $E[Y]=E[B]\, E[U_1] + E[1-B]\, E[U_2]$ by independence and linearity.
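That representation translates directly into a simulation; a minimal sketch in plain Python (the names are mine):

```python
import random

rng = random.Random(7)
p = 0.6
n = 200_000

def sample_Y(rng):
    # B ~ Bernoulli(p), U1 ~ U(0, 1), U2 ~ U(0, 2), all independent.
    B = 1 if rng.random() < p else 0
    U1 = rng.uniform(0, 1)
    U2 = rng.uniform(0, 2)
    return B * U1 + (1 - B) * U2

est = sum(sample_Y(rng) for _ in range(n)) / n
exact = p * 0.5 + (1 - p) * 1.0   # E[B]E[U1] + E[1-B]E[U2] = 1 - p/2
```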


Because $X$ is discrete and $Y\mid X$ is continuous, the joint distribution is given by a mixed (piecewise) density-mass function. We can still work with this: sum over the discrete variable and integrate over the continuous one, as we otherwise would.

$\begin{align} \mathsf P(X=k) & = \begin{cases}p & : k=1 \\ 1- p & : k=2\\ 0 & : \text{elsewhere}\end{cases} & \text{( Given )} \\[3ex] f_{Y\mid X}(y\mid k) & = \begin{cases}\frac 1 k & : y\in [0; k] \\ 0 & : \text{elsewhere}\end{cases} & (\;{Y\mid {X=k}}\;\,\sim\;\, \mathcal{U}[0;k]\;) \\[3ex] f_{Y,X}(y, k) &= f_{Y\mid X}(y\mid k)\; \mathsf P(X=k) & \text{( by definition )} \\[1ex] & = \begin{cases} p & : k=1, y\in [0;1] \\ (1-p)/2 & : k=2, y\in [0;2]\\ 0 & : \text{elsewhere}\end{cases} \\[3ex] \mathsf E(Y) & = \sum_{k\in \{1, 2\}} \int_0^k y\,f_{Y, X}(y, k)\operatorname d y \\[1ex] & = \int_0^1 p y \operatorname d y + \int_0^2 (1-p)y/2 \operatorname d y & \mathop{\color{red}{\boxed{\color{blue}{= p\,\mathsf E(Y\mid X=1)+(1-p)\,\mathsf E(Y\mid X=2)}}}}_{\text{which is the Law of Iterated Expectation}} \\[1ex] & = \frac p 2 + (1-p) \\[1ex] & = 1-\frac p 2 \end{align}$
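The mixed density-mass table above can be checked mechanically (a SymPy sketch): the total mass should be $1$, and summing over $k$ while integrating over $y$ should give $1-\frac{p}{2}$.

```python
import sympy as sp

y, p = sp.symbols('y p', positive=True)
# f_{Y,X}(y, k): p for k = 1, y in [0, 1]; (1-p)/2 for k = 2, y in [0, 2].
joint = {1: p, 2: (1 - p) / 2}
total = sum(sp.integrate(joint[k], (y, 0, k)) for k in (1, 2))
EY = sum(sp.integrate(y * joint[k], (y, 0, k)) for k in (1, 2))
# total should simplify to 1, and EY to 1 - p/2.
```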