pdf of a mixture of two random variables


Let's say that a random variable $X$ is Gamma$(\alpha,\beta)$ with probability $p$ and $\chi^2(r)$ with probability $1-p$. How do I prove that $f_X(x) = p\,\frac{1}{\Gamma(\alpha)\beta^\alpha}x^{\alpha-1}e^{-x/\beta} + (1-p)\,\frac{1}{\Gamma(r/2)\,2^{r/2}}x^{r/2-1}e^{-x/2}$?

It makes sense intuitively, but I can't think of a way to prove it.

On BEST ANSWER

We can make an analogy to flipping a coin with $P(H) = p$ and $P(T) = 1-p$. Let $Y \sim \mathrm{Gamma}(\alpha,\beta)$ and $Z \sim \chi_{(r)}^{2}$ be independent of the coin, and define $$X = \begin{cases} Y & \text{if the coin lands heads} \\ Z & \text{if the coin lands tails.} \end{cases}$$ Conditioning on the outcome of the coin (law of total probability), \begin{align*} F_{X}(x) & = P(X \leq x) \\ & = P(X \leq x \mid H)\,P(H) + P(X \leq x \mid T)\,P(T) \\ & = P(Y \leq x)\,p + P(Z \leq x)\,(1-p) \\ & = p\,F_{Y}(x) + (1-p)\,F_{Z}(x). \end{align*} Differentiating with respect to $x$ gives $$f_{X}(x) = p\,f_{Y}(x) + (1-p)\,f_{Z}(x),$$ which is exactly the claimed mixture of the Gamma$(\alpha,\beta)$ and $\chi_{(r)}^{2}$ densities. The analogy is in fact the formal construction: a random variable that is Gamma with probability $p$ and chi-square with probability $1-p$ is precisely this two-stage experiment, so nothing needs to be removed to make the argument rigorous.
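The two-stage construction above is easy to check numerically: simulate the coin flip, draw from the corresponding distribution, and compare a histogram of the samples against the claimed mixture density. This is just a sanity-check sketch with arbitrary example parameters ($p = 0.3$, $\alpha = 2$, $\beta = 1.5$, $r = 4$), using SciPy's `gamma` and `chi2` pdfs, which match the parametrizations in the question.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
p, alpha, beta, r = 0.3, 2.0, 1.5, 4  # arbitrary example parameters
n = 200_000

# Stage 1: flip the coin; stage 2: draw from Gamma on heads, chi-square on tails
heads = rng.random(n) < p
samples = np.where(heads,
                   rng.gamma(alpha, beta, n),
                   rng.chisquare(r, n))

# Histogram estimate of the density of X
edges = np.linspace(0, 15, 76)
hist, _ = np.histogram(samples, bins=edges, density=True)
centers = (edges[:-1] + edges[1:]) / 2

# Claimed mixture density p*f_Y + (1-p)*f_Z evaluated at the bin centers
f_mix = (p * stats.gamma.pdf(centers, a=alpha, scale=beta)
         + (1 - p) * stats.chi2.pdf(centers, df=r))

# With n large, the histogram should track the mixture density closely
print(np.max(np.abs(hist - f_mix)))
```

The same check works for any mixture: the sampling loop is literally the coin-flip definition of $X$, and the agreement between the histogram and `f_mix` is a Monte Carlo confirmation of $f_X = p f_Y + (1-p) f_Z$.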