Sum of independent Gamma distributions is a Gamma distribution


If $X\sim \Gamma(a_1,b)$ and $Y \sim \Gamma(a_2,b)$, I need to prove $X+Y\sim\Gamma(a_1+a_2,b)$ if $X$ and $Y$ are independent.

I am trying to apply the convolution integral for independent random variables and to multiply the gamma densities, but I am stuck.

There are 2 answers below.

BEST ANSWER

Now that the homework deadline is presumably long past, here is a proof for the case of $b=1$, adapted from an answer of mine on stats.SE, which fleshes out the details of what I said in a comment on the question.

If $X$ and $Y$ are independent continuous random variables, then the probability density function of $Z=X+Y$ is given by the convolution of the probability density functions $f_X(x)$ and $f_Y(y)$ of $X$ and $Y$ respectively. Thus, $$f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(x)f_Y(z-x)\,\mathrm dx. $$ But when $X$ and $Y$ are nonnegative random variables, $f_X(x) = 0$ when $x < 0$, and for a positive number $z$, $f_Y(z-x) = 0$ when $x > z$. Consequently, for $z > 0$, the above integral can be simplified to $$\begin{align} f_{X+Y}(z) &= \int_0^z f_X(x)f_Y(z-x)\,\mathrm dx\\ &=\int_0^z \frac{x^{a_1-1}e^{-x}}{\Gamma(a_1)}\frac{(z-x)^{a_2-1}e^{-(z-x)}}{\Gamma(a_2)}\,\mathrm dx\\ &= e^{-z}\int_0^z \frac{x^{a_1-1}(z-x)^{a_2-1}}{\Gamma(a_1)\Gamma(a_2)}\,\mathrm dx &\scriptstyle{\text{now substitute}}~ x = zt~ \text{and think}\\ &= e^{-z}z^{a_1+a_2-1}\int_0^1 \frac{t^{a_1-1}(1-t)^{a_2-1}}{\Gamma(a_1)\Gamma(a_2)}\,\mathrm dt & \scriptstyle{\text{of Beta}}(a_1,a_2)~\text{random variables}\\ &= \frac{e^{-z}z^{a_1+a_2-1}}{\Gamma(a_1+a_2)} \end{align}$$ where the last step uses the Beta integral $\int_0^1 t^{a_1-1}(1-t)^{a_2-1}\,\mathrm dt = \frac{\Gamma(a_1)\Gamma(a_2)}{\Gamma(a_1+a_2)}$. The result is exactly the density of a $\Gamma(a_1+a_2,1)$ random variable, as was to be shown.
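If you want a sanity check of the convolution identity, here is a small numerical sketch (with $b=1$ and arbitrarily chosen shape parameters $a_1=2.5$, $a_2=3.0$): it evaluates the convolution integral with `scipy.integrate.quad` and compares it to the $\Gamma(a_1+a_2,1)$ density.

```python
import numpy as np
from scipy import stats, integrate

# Arbitrary example shapes; b = 1 as in the derivation above.
a1, a2 = 2.5, 3.0

def conv_pdf(z):
    # f_{X+Y}(z) = integral_0^z f_X(x) f_Y(z - x) dx
    val, _err = integrate.quad(
        lambda x: stats.gamma.pdf(x, a1) * stats.gamma.pdf(z - x, a2),
        0.0, z)
    return val

z = 4.0
direct = stats.gamma.pdf(z, a1 + a2)   # Gamma(a1 + a2, 1) density
assert abs(conv_pdf(z) - direct) < 1e-8
```

The two values agree to numerical precision at any $z>0$ you try, which is a quick way to catch algebra mistakes before writing up the proof.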


An easier method is to use the moment generating function. With $b$ the rate parameter, the MGF of $\Gamma(a,b)$ is $\left(1-\frac{t}{b}\right)^{-a}$ for $t<b$. Since $X$ and $Y$ are independent, $$M_{X+Y}(t)=E\left(e^{(X+Y)t}\right)=E\left(e^{Xt}\right)E\left(e^{Yt}\right)=\left(1-\frac{t}{b}\right)^{-a_1}\left(1-\frac{t}{b}\right)^{-a_2}=\left(1-\frac{t}{b}\right)^{-(a_1+a_2)},$$ which is the MGF of a $\Gamma(a_1+a_2,b)$ distribution. Because the MGF determines the distribution, $X+Y\sim\Gamma(a_1+a_2,b)$.
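The MGF argument can also be checked numerically. This sketch (with arbitrarily chosen parameters, rate parametrization) verifies the product-of-MGFs identity and then confirms by simulation that $X+Y$ matches $\Gamma(a_1+a_2,b)$ via a Kolmogorov–Smirnov test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a1, a2, b = 2.0, 3.5, 1.5   # arbitrary example parameters; b is the rate

# MGF of Gamma(a, b) in the rate parametrization: (1 - t/b)^(-a) for t < b.
t = 0.5
mgf = lambda a: (1.0 - t / b) ** (-a)

# Product of the MGFs equals the MGF of Gamma(a1 + a2, b):
assert np.isclose(mgf(a1) * mgf(a2), mgf(a1 + a2))

# Monte Carlo check: X + Y should be distributed as Gamma(a1 + a2, b).
# numpy's gamma sampler takes a *scale* parameter, i.e. 1/b.
x = rng.gamma(shape=a1, scale=1.0 / b, size=200_000)
y = rng.gamma(shape=a2, scale=1.0 / b, size=200_000)
stat, pvalue = stats.kstest(x + y, 'gamma', args=(a1 + a2, 0, 1.0 / b))
assert pvalue > 0.001   # samples are consistent with Gamma(a1 + a2, b)
```

Note the parametrization trap: if you write $\Gamma(a,\theta)$ with a *scale* parameter instead, the MGF is $(1-\theta t)^{-a}$, but the same cancellation goes through because $X$ and $Y$ share the second parameter.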