Statistics: Deriving a Joint Probability Function From Definitions of Other PDFs


Here's a particular question I'm trying to understand from the lecture notes.
It says:

Assume that $Y$ denotes the number of bacteria per cubic centimeter in a particular liquid and that $Y$ has a Poisson distribution with parameter $x$. Further assume that $x$ varies from location to location and has an exponential distribution with parameter $\beta = 1$.

Find $f_{X,Y}(x,y)$, the joint probability function of $X$ and $Y$.

In the lecture slides, it says: $$f_{X,Y}(x,y) = f_{Y\mid X}(y\mid x)f_X(x)$$ Where $f_{Y\mid X}(y\mid x)$ is the PDF for the Poisson distribution, and $f_X(x)$ is the PDF for the exponential with $\beta = 1$.

I'm not sure how $f_{X,Y}(x,y) = f_{Y\mid X}(y\mid x)f_X(x)$ came to be. Is there anything for me to look for in the question that would hint to using this form?

Any help would be appreciated.
Thanks.

4 Answers

Accepted answer

The hints in the problem:

  • You are being asked for the joint PDF of $Y$ and $X$, $f_{Y,X}(y,x)$. From this one thing must be clear: there are two random variables involved in the problem. How could you compute this in general? Two cases are important here:

    • If they are independent, $f_{Y,X}(y,x) = f_X(x)f_Y(y)$, and some information should be given to compute those marginal PDFs.
    • If they are not independent, $f_{Y,X}(y,x) = f_{Y\mid X}(y \mid x)f_X(x)$ and some information should be given to compute the corresponding PDFs (one conditional and the other one marginal).
  • We are told that the parameter $x$ in the distribution (PDF) of $Y$ varies and has its very own distribution! That is, the other random variable involved, $X$, is the parameter in the distribution of $Y$, what a relationship!...wait a minute...that means that $Y$ depends on $X$! Moreover, given that $X$ takes some value $x$, $Y$ has a Poisson distribution with parameter $x$, that is

$$f_{Y \mid X}(y \mid x) = e^{-x}\frac{x^y}{y!}$$
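Completing the computation for this particular example (a sketch): with $\beta = 1$, the exponential density is $f_X(x) = e^{-x}$ for $x \ge 0$, so multiplying gives

$$f_{X,Y}(x,y) = f_{Y\mid X}(y\mid x)\,f_X(x) = e^{-x}\frac{x^y}{y!}\cdot e^{-x} = \frac{x^y e^{-2x}}{y!}, \qquad x \ge 0,\; y = 0,1,2,\dots$$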

Answer

Just like there is a product rule in the discrete case, $$P(X,Y) = P(Y\mid X)P(X),$$ there is also a product rule in the continuous case $$f_{X,Y}(x,y) = f_{Y\mid X}(y\mid x)f_X(x).$$

So, the instructor is saying that to get the joint density, you can multiply the density of $Y$ given $X$ by the density of $X$.
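To make the product rule concrete, here is a quick simulation sketch (Python): sampling from the joint by drawing $X$ first and then $Y$ given $X$, exactly as the factorization suggests. For this mixture the marginal works out to $P(Y = y) = \int_0^\infty \frac{x^y e^{-2x}}{y!}\,dx = 2^{-(y+1)}$, which the simulation can check empirically.

```python
import math
import random

random.seed(0)

def sample_y():
    """Draw one Y from the joint: X ~ Exp(1) first, then Y | X=x ~ Poisson(x)."""
    x = random.expovariate(1.0)
    # Knuth's Poisson sampler; fine for the small means Exp(1) produces.
    threshold, k, p = math.exp(-x), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

n = 100_000
zeros = sum(1 for _ in range(n) if sample_y() == 0)
# Marginally, P(Y = 0) = integral of e^(-2x) dx = 1/2.
print(zeros / n)  # should be close to 0.5
```

The point of the sketch is that sampling "first $X$, then $Y$ given $X$" is the generative reading of $f_{X,Y}(x,y) = f_{Y\mid X}(y\mid x)f_X(x)$.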

Answer

This follows from the definition of conditional probability. For two events $A,B$, the probability of their joint occurrence is $P(A\cap B)$, and the conditional probability that $B$ occurs given that event $A$ has taken place, written $P(B\mid A)$, is defined as \begin{equation} P(B\mid A)=\frac{P(A\cap B)}{P(A)} \end{equation}

Now map the elements of the sample space to real numbers; such a mapping is called a random variable. We can then define the distribution of a random variable, and from the distribution its density. Let $f_X(x)$ be the density of the random variable $X$ and $f_Y(y)$ that of $Y$; their joint density is written $f_{X,Y}(x,y)$. Following the definition of conditional probability, we can write the conditional density of $Y\mid X$ as \begin{equation} f_{Y\mid X}(y\mid x)=\frac{f_{X,Y}(x,y)}{f_X(x)} \end{equation}

Multiplying both sides by $f_X(x)$ gives exactly the form in your lecture slides, $f_{X,Y}(x,y) = f_{Y\mid X}(y\mid x)f_X(x)$. Hope this clears up the doubt. The definition is quite general, hence it holds for the scenario you mention: in the lecture the author is just using the definition of conditional density, which follows from the definition of conditional probability.
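A small numerical sketch of that definition, using the Poisson/exponential example from the question. Multiplying the Poisson conditional $e^{-x}x^y/y!$ by the $\mathrm{Exp}(1)$ marginal $e^{-x}$ gives the joint $x^y e^{-2x}/y!$; dividing the joint by the marginal should then recover the Poisson PMF exactly.

```python
import math

def f_X(x):
    """Exponential(beta = 1) density."""
    return math.exp(-x)

def f_joint(x, y):
    """Joint density for the bacteria example: x^y e^(-2x) / y!."""
    return x**y * math.exp(-2 * x) / math.factorial(y)

def poisson_pmf(y, lam):
    return math.exp(-lam) * lam**y / math.factorial(y)

# f_{Y|X}(y|x) = f_{X,Y}(x,y) / f_X(x) should equal the Poisson pmf.
for x in (0.5, 1.0, 2.3):
    for y in range(5):
        assert abs(f_joint(x, y) / f_X(x) - poisson_pmf(y, x)) < 1e-12
```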

Answer

Is it true that if we extend this to a random vector $(X,Y,Z)$ with a PDF $f_{X,Y,Z}(x,y,z)$ (where $X$, $Y$, $Z$ also have their own PDFs, with the same notation as above), the conditional density would be equal to $$f_{X,Y\mid Z}(x,y\mid z) = \frac{f_{X,Y,Z}(x,y,z)}{f_{Z}(z)}?$$