What is the distribution of $E[X\mid Y]$?


Let $(X, Y)$ be a pair of random variables with joint p.m.f. given by the following table.

| $p_{X,Y}(x,y)$ | $X=0$ | $X=1$ |
|---|---|---|
| $Y=0$ | $\frac{1}{15}$ | $\frac{2}{15}$ |
| $Y=1$ | $\frac{3}{15}$ | $\frac{4}{15}$ |
| $Y=2$ | $\frac{2}{15}$ | $\frac{3}{15}$ |

  1. What is the marginal distribution of $X$?
  2. Are $X$ and $Y$ independent?
  3. What is the conditional p.m.f. of $X$ given $Y=0$?
  4. What is the distribution of $E[X\mid Y]$?

My attempt

  1. The marginal distribution of $X$ can be found by summing each column of the table: $$\mathbb{P}(X=0)=\frac{1}{15}+\frac{3}{15}+\frac{2}{15}=\frac{6}{15}$$ $$\mathbb{P}(X=1)=\frac{2}{15}+\frac{4}{15}+\frac{3}{15}=\frac{9}{15}$$
  2. They are not independent: for example, $$\mathbb{P}(X=0, Y=0)=\frac{1}{15}\neq \frac{6}{15}\cdot\frac{3}{15}=\mathbb{P}(X=0)\,\mathbb{P}(Y=0)$$
  3. The conditional p.m.f. of $X$ given $Y=y$ is given by the formula $$p_{X\mid Y}(x\mid y)=\frac{p_{X,Y}(x,y)}{p_Y(y)}$$ Since $p_Y(0)=\frac{3}{15}$, $$\mathbb{P}(X=0\mid Y=0)=\frac{1/15}{3/15}=\frac{1}{3}, \qquad \mathbb{P}(X=1\mid Y=0)=\frac{2/15}{3/15}=\frac{2}{3}$$
  4. I am not sure what is meant by the distribution of $\mathbb{E}[X\mid Y]$. I know the formula $$\mathbb{E}[X\mid Y=y]=\sum_{x\in X(\Omega)}x\,p_{X\mid Y}(x\mid y),$$ and I can use it to compute a value for each of $Y=0,1,2$, but how do I find the distribution?
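
As a sanity check, here is a minimal Python sketch (assuming the joint p.m.f. table above) that recomputes parts 1–4 exactly with `fractions.Fraction`:

```python
from fractions import Fraction as F

# Joint p.m.f. p(x, y) from the table above (rows: y in {0,1,2}; columns: x in {0,1}).
p = {(0, 0): F(1, 15), (1, 0): F(2, 15),
     (0, 1): F(3, 15), (1, 1): F(4, 15),
     (0, 2): F(2, 15), (1, 2): F(3, 15)}
xs, ys = (0, 1), (0, 1, 2)

# Part 1: marginal of X, summing over y. Gives P(X=0)=6/15=2/5, P(X=1)=9/15=3/5.
pX = {x: sum(p[x, y] for y in ys) for x in xs}

# Part 2: X and Y are independent iff every cell equals the product of the marginals.
pY = {y: sum(p[x, y] for x in xs) for y in ys}
independent = all(p[x, y] == pX[x] * pY[y] for x in xs for y in ys)

# Part 3: conditional p.m.f. of X given Y=0. Gives 1/3 and 2/3.
pX_given_Y0 = {x: p[x, 0] / pY[0] for x in xs}

# Part 4: g(y) = E[X | Y=y]; the pairs (g(y), pY[y]) describe the distribution of E[X|Y].
g = {y: sum(x * p[x, y] for x in xs) / pY[y] for y in ys}

print(pX, independent, pX_given_Y0, g, pY, sep="\n")
# {0: Fraction(2, 5), 1: Fraction(3, 5)}
# False
# {0: Fraction(1, 3), 1: Fraction(2, 3)}
# {0: Fraction(2, 3), 1: Fraction(4, 7), 2: Fraction(3, 5)}
# {0: Fraction(1, 5), 1: Fraction(7, 15), 2: Fraction(1, 3)}
```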

Accepted answer

$E(X\mid Y)$ is a function of the random variable $Y$. If you write $g(Y)=E(X\mid Y)$, then you can find its p.m.f. from that of $Y$:

$$P(g(Y)=g(0))=P(Y=0)\\ P(g(Y)=g(1))=P(Y=1)\\ P(g(Y)=g(2))=P(Y=2)$$
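
The values $g(y)=E[X\mid Y=y]$ come from the conditional p.m.f.s; for instance, $$g(0)=0\cdot\frac{1}{3}+1\cdot\frac{2}{3}=\frac{2}{3},$$ and similarly $g(1)=\frac{4}{7}$ and $g(2)=\frac{3}{5}$. Hence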

$$P\left(g(Y)=\frac 23\right)=\frac 15\\ P\left(g(Y)=\frac 47\right)=\frac 7{15}\\ P\left(g(Y)=\frac 35\right)=\frac 13$$

Second answer

You have

  • $E[X \mid Y=0] = \frac23$
  • $E[X \mid Y=1] = \frac47$
  • $E[X \mid Y=2] = \frac35$

$E[X \mid Y]$ is therefore a function of $Y$, taking the value $\frac23$ when $Y=0$, $\frac47$ when $Y=1$, and $\frac35$ when $Y=2$.

In a sense that function is the answer, but you could write down any function that agrees with those three values, such as $$E[X \mid Y] = \dfrac{13Y^2-33Y+140}{210}$$ for $Y\in \{0,1,2\}$.
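
Indeed, at $Y=0$ this gives $\frac{140}{210}=\frac{2}{3}$, at $Y=1$ it gives $\frac{120}{210}=\frac{4}{7}$, and at $Y=2$ it gives $\frac{126}{210}=\frac{3}{5}$.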

Since $Y$ is a random variable, $E[X \mid Y]$ is itself a random variable, as Stacker notes in giving its distribution.

Third answer

If we compare the terminology in the first and third parts of the question (distribution versus p.m.f.), then there is additional work to be done for the first and fourth parts. Namely, the first part asks for the distribution function $F_{X}(x)$, the probability that the random variable $X$ does not exceed $x$, and the fourth part asks for the distribution function $F_{g(Y)}(y)$, the probability that the random variable $g(Y)$ does not exceed $y$, where I use the notation of Stacker's solution; see Chapter 2, Section 1 of Grimmett and Stirzaker for more details.

Continuing from your computation, we have \begin{equation} F_{X}(x) = \begin{cases} 0, & \text{if $x < 0$}\\ \frac{2}{5}, & \text{if $0 \leq x < 1$}\\ 1, & \text{if $x \geq 1$,} \end{cases} \end{equation} which satisfies the usual properties: $\lim_{x \to - \infty} F_{X}(x) = 0$, $\lim_{x \to \infty} F_{X}(x) = 1$, $F_X$ is non-decreasing, and $F_{X}$ is right-continuous. Now, continuing from the computation in Stacker's solution, $g(Y)$ takes the values $\frac{4}{7} < \frac{3}{5} < \frac{2}{3}$ with probabilities $\frac{7}{15}$, $\frac{1}{3}$, and $\frac{1}{5}$, respectively, so \begin{equation} F_{g(Y)}(y) = \begin{cases} 0, & \text{if $y < \frac{4}{7}$}\\ \frac{7}{15}, & \text{if $\frac{4}{7} \leq y < \frac{3}{5}$}\\ \frac{4}{5}, & \text{if $\frac{3}{5} \leq y < \frac{2}{3}$}\\ 1, & \text{if $y \geq \frac{2}{3}$,} \end{cases} \end{equation} which again is non-decreasing, right-continuous, and has the correct limits at $\pm\infty$.
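
As a quick check of this step function, here is a small Python sketch using the atoms and weights from Stacker's answer:

```python
from fractions import Fraction as F

# Atoms of g(Y) = E[X | Y] and their probabilities.
atoms = {F(2, 3): F(1, 5), F(4, 7): F(7, 15), F(3, 5): F(1, 3)}

def cdf(y):
    """F_{g(Y)}(y) = P(g(Y) <= y)."""
    return sum(prob for v, prob in atoms.items() if v <= y)

for v in sorted(atoms):  # jumps occur at the ordered atoms 4/7 < 3/5 < 2/3
    print(v, cdf(v))     # 4/7 -> 7/15, 3/5 -> 4/5, 2/3 -> 1
```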

As an aside, you could add a fifth part to the question: compute the expectation $\mathbb{E}[X]$ from your findings in the first part, and then compute $\mathbb{E}[X] = \mathbb{E}[\mathbb{E}[X|Y]] = \mathbb{E}[g(Y)]$ from the findings in the fourth part (with help from the law of the unconscious statistician). The identity $\mathbb{E}[X] = \mathbb{E}[\mathbb{E}[X|Y]]$ is known as the law of total expectation.
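
Concretely, both routes give the same number: $$\mathbb{E}[X]=0\cdot\frac{6}{15}+1\cdot\frac{9}{15}=\frac{3}{5},\qquad \mathbb{E}[g(Y)]=\frac{2}{3}\cdot\frac{1}{5}+\frac{4}{7}\cdot\frac{7}{15}+\frac{3}{5}\cdot\frac{1}{3}=\frac{2}{15}+\frac{4}{15}+\frac{3}{15}=\frac{3}{5}.$$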