Proving a Gamma function identity via a probabilistic approach.


I want to prove an analysis result using a probabilistic approach: if $X \sim \Gamma(\alpha_1,\beta)$ and $Y \sim \Gamma(\alpha_2, \beta)$ are independent, then $Z = X+Y \sim \Gamma(\alpha_1 +\alpha_2, \beta)$. While proving that $Z \sim \Gamma(\alpha_1 + \alpha_2, \beta)$, I am trying to show $$ \int_0^1u^{\alpha_1 -1}(1-u)^{\alpha_2 -1}\,du = \frac{\Gamma(\alpha_1) \Gamma(\alpha_2)}{\Gamma(\alpha_1 +\alpha_2)} $$ However, I can't seem to figure it out. I used the convolution formula to find the PDF of $Z$, but then I got stuck.
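As a quick numerical sanity check of the identity in question (not a proof), one can compare a quadrature approximation of the integral against the Gamma-function ratio. This is a minimal sketch assuming Python's standard library; the helper name `beta_integral` and the midpoint rule are illustrative choices:

```python
import math

def beta_integral(a1, a2, n=200_000):
    """Midpoint-rule approximation of integral_0^1 u^(a1-1) (1-u)^(a2-1) du."""
    h = 1.0 / n
    return sum(((k + 0.5) * h) ** (a1 - 1) * (1 - (k + 0.5) * h) ** (a2 - 1)
               for k in range(n)) * h

a1, a2 = 2.5, 3.0
lhs = beta_integral(a1, a2)                                   # the integral
rhs = math.gamma(a1) * math.gamma(a2) / math.gamma(a1 + a2)   # the claimed value
print(lhs, rhs)
```

For $\alpha_1, \alpha_2 > 1$ the integrand vanishes at both endpoints, so the midpoint rule converges cleanly; the two printed values agree to several decimal places.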

1 Answer

Let $X \sim \text{Gamma}(\alpha_1, \beta)$ and $Y \sim \text{Gamma}(\alpha_2, \beta)$ be independent random variables.

The moment-generating functions (MGFs) of $X$ and $Y$ are $M_{X}(t) = \left(\dfrac{\beta}{\beta-t}\right)^{\alpha_1}$ and $M_Y(t) = \left(\dfrac{\beta}{\beta-t} \right)^{\alpha_2}$ respectively, for $t < \beta$ (here $\beta$ is the rate parameter).

Recall that $M_{X+Y}(t) = M_X(t)M_Y(t)$, because $$M_{X+Y}(t) = \mathbb{E}[e^{t(X+Y)}] = \mathbb{E}[e^{tX+tY}]=\mathbb{E}[e^{tX}e^{tY}]=\mathbb{E}[e^{tX}]\mathbb{E}[e^{tY}]=M_{X}(t)M_{Y}(t)$$ where $\mathbb{E}[e^{tX}e^{tY}]=\mathbb{E}[e^{tX}]\mathbb{E}[e^{tY}]$ due to independence of $X$ and $Y$.
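The factorization step can be illustrated by Monte Carlo: for independent Gamma draws, the empirical mean of $e^{t(X+Y)}$ matches the product of the empirical means of $e^{tX}$ and $e^{tY}$, and both match the closed form. A minimal sketch assuming Python's standard library (note `random.gammavariate` takes shape and *scale*, so scale $= 1/\beta$ in the rate parametrization used here):

```python
import math
import random

random.seed(0)
a1, a2, rate, t = 2.0, 3.5, 2.0, 0.5   # t < rate, so the MGF is finite
N = 200_000
xs = [random.gammavariate(a1, 1 / rate) for _ in range(N)]
ys = [random.gammavariate(a2, 1 / rate) for _ in range(N)]

mgf_sum = sum(math.exp(t * (x + y)) for x, y in zip(xs, ys)) / N   # E[e^{t(X+Y)}]
mgf_prod = (sum(math.exp(t * x) for x in xs) / N) \
         * (sum(math.exp(t * y) for y in ys) / N)                  # E[e^{tX}] E[e^{tY}]
exact = (rate / (rate - t)) ** (a1 + a2)                           # (beta/(beta-t))^(a1+a2)
print(mgf_sum, mgf_prod, exact)
```

All three values agree up to Monte Carlo error, which is the content of $M_{X+Y}(t) = M_X(t)M_Y(t)$ for independent $X$ and $Y$.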

Hence,

$$M_{X+Y}(t)=\left(\dfrac{\beta}{\beta-t}\right)^{\alpha_1}\left(\dfrac{\beta}{\beta-t}\right)^{\alpha_2} = \left(\dfrac{\beta}{\beta-t}\right)^{\alpha_1+\alpha_2}$$ This is the MGF of a $\text{Gamma}(\alpha_1+\alpha_2, \beta)$ random variable, and since an MGF that is finite in a neighborhood of $0$ uniquely determines the distribution, $X+Y \sim \text{Gamma}(\alpha_1+\alpha_2, \beta)$.
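The conclusion can also be checked empirically: the mean and variance of simulated sums should match those of a $\text{Gamma}(\alpha_1+\alpha_2, \beta)$ variable, namely $(\alpha_1+\alpha_2)/\beta$ and $(\alpha_1+\alpha_2)/\beta^2$. A minimal sketch assuming Python's standard library (again, `random.gammavariate` takes shape and scale $= 1/\beta$):

```python
import random

random.seed(1)
a1, a2, rate = 1.5, 2.5, 3.0          # rate parametrization: scale = 1/rate
N = 200_000
z = [random.gammavariate(a1, 1 / rate) + random.gammavariate(a2, 1 / rate)
     for _ in range(N)]

mean = sum(z) / N
var = sum((v - mean) ** 2 for v in z) / N
print(mean, var)   # should approach (a1+a2)/rate and (a1+a2)/rate**2
```

With $\alpha_1+\alpha_2 = 4$ and $\beta = 3$, the sample mean and variance come out close to $4/3$ and $4/9$, consistent with $X+Y \sim \text{Gamma}(4, 3)$.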