Marginal P.M.F. and Conditional Expectation?


I have a joint density function that is formulated as follows: $$ f_{X,Y}(k,y) = \begin{cases} \frac{\partial{P(X=k, Y\le y)}}{\partial y} = \lambda \frac{(\lambda y)^k}{k!}e^{-2\lambda y} & \text{for } k = 0,1,2,...\text{ and } y \in[0,\infty) \\ 0 & \text{elsewhere.} \end{cases} $$

I have two questions related to this mixed joint probability mass and density function.

First:

I am supposed to determine the marginal probability mass function of $X$, but I can't see how this should be done. I assumed that calculating a marginal p.m.f. requires summing discretely over the values of $y$. Doing it continuously (integrating over $y$) gave me the correct answer, but I thought a p.m.f. had to come from a discrete sum. I tried as follows: $$ \sum_{y=0}^{\infty} \lambda \frac{(\lambda y)^k}{k!}e^{-2\lambda y} = \frac{\lambda ^{k+1}}{k!} \sum_{y=0}^{\infty} y^ke^{-2\lambda y} $$

But I couldn't simplify it from there. The answer is that the marginal p.m.f. of $X$ is a geometric distribution with $p = 0.5$. How can this be shown?

Second:

I need to compute $E[X \mid Y = y]$. I am trying the following approach (the equality follows from a theorem in my course literature):

$$ E[X|Y=y]=\frac{\int\limits_{-\infty}^\infty xf_{X,Y}(x,y)dx}{\int\limits_{-\infty}^\infty f_{X,Y}(x,y)dx} $$ But I can't see how this can be the correct approach here (I get $0$).

Thanks!

Jam


There are 2 answers below.

Best answer:

You have a fundamental misunderstanding of how a marginal distribution is computed. If $X$ is discrete and $Y$ is continuous, then the marginal distribution for $X$ is computed by integrating over the continuous support of $Y$, and the marginal distribution for $Y$ is computed by summing over the discrete support of $X$.

Thus, the correct calculation is $$\Pr[X = k] = \int_{y=0}^\infty \lambda \frac{(\lambda y)^k}{k!} e^{-2\lambda y} \, dy.$$ The marginal density of $Y$ is given by $$f_Y(y) = \sum_{k=0}^\infty \lambda \frac{(\lambda y)^k}{k!} e^{-2\lambda y}.$$ In each case, we integrate or sum over the "other" variable, where the choice of integration or summation follows the type of support that other variable has.
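As a sanity check, here is a small Python sketch (my own, not part of the original answer) that evaluates the integral for $\Pr[X = k]$ numerically with a plain trapezoidal rule and compares it to the claimed geometric form $(1/2)^{k+1}$. The rate $\lambda = 1.3$ is an arbitrary choice; the marginal of $X$ should not depend on it.

```python
from math import exp, factorial

lam = 1.3  # arbitrary rate; the marginal of X should not depend on it

def joint(k, y):
    # joint mass-density: lambda * (lambda*y)^k / k! * exp(-2*lambda*y)
    return lam * (lam * y) ** k / factorial(k) * exp(-2.0 * lam * y)

def marginal_pmf(k, upper=40.0, n=200_000):
    # composite trapezoidal rule on [0, upper]; the integrand decays
    # like exp(-2*lam*y), so the truncated tail is negligible
    h = upper / n
    total = 0.5 * (joint(k, 0.0) + joint(k, upper))
    for i in range(1, n):
        total += joint(k, i * h)
    return total * h

for k in range(6):
    print(k, marginal_pmf(k), 0.5 ** (k + 1))
```

The printed pairs agree to several decimal places, which matches the closed-form evaluation via the gamma integral $\int_0^\infty y^k e^{-2\lambda y}\,dy = k!/(2\lambda)^{k+1}$.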

The conditional expectation is easily obtained as $$\mathrm{E}[X \mid Y = y] = \sum_{k=0}^\infty k \Pr[X = k \mid Y = y].$$ All that is needed is to realize that the conditional distribution of $X$ given $Y = y$ is obtained from the definition of conditional probability: $$\Pr[X = k \mid Y = y] f_Y(y) = f_{X,Y}(k,y),$$ which, if you have calculated the marginal density of $Y$, is easy to get and recognize.
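Carrying that recipe through explicitly (a worked step not spelled out above): summing the joint function over $k$ gives
$$f_Y(y) = \sum_{k=0}^\infty \lambda \frac{(\lambda y)^k}{k!} e^{-2\lambda y} = \lambda e^{-2\lambda y} e^{\lambda y} = \lambda e^{-\lambda y},$$
so
$$\Pr[X = k \mid Y = y] = \frac{f_{X,Y}(k,y)}{f_Y(y)} = \frac{(\lambda y)^k}{k!} e^{-\lambda y},$$
i.e. $X \mid Y = y \sim \mathrm{Poisson}(\lambda y)$, and therefore $\mathrm{E}[X \mid Y = y] = \lambda y$.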

Second answer:

> I am supposed to determine the marginal probability mass function of X, but I can't see how this will be done. Calculating the marginal p.m.f. requires summing discretely over the values of $y$.

No. The marginal probability mass function of the discrete random variable $X$ is obtained by integrating the joint probability mass and density function over the support of $Y$, the continuous random variable. (Mnemonically: $X$ has mass, $Y$ has density; mass is summed while density is integrated.)

$$f_X(k) = \int_0^\infty f_{X,Y}(k, y) \operatorname d y$$

Similarly: $f_Y(y) = \sum\limits_{k=0}^\infty f_{X,Y}(k, y)$


For the conditional expectation you need: $\mathsf E[X\mid Y=y] = \sum\limits_{k=0}^\infty k \dfrac{f_{X,Y}(k,y)}{f_Y(y)}$