Convolution formula proof - discrete random variables


Let $X, Y$ be discrete random variables taking values in $\{0, 1, 2, \dots\}$.

$f_{X}(t)=\sum_{k=0}^{\infty} P(X=k)\,t^{k}$ is the probability generating function of $X$,

and the following result was given:

$$f_{X+Y}(t)=f_{X}(t)\cdot f_{Y}(t)$$

Can someone explain how this formula, known as the convolution formula, is derived?


BEST ANSWER

This is true if $X,Y$ are independent. If $X,Y$ are independent, then $$f_{X+Y}(z)=E[z^{X+Y}] = E[z^{X}z^{Y}] = E[z^{X}]\, E[z^{Y}] = f_X(z)\,f_Y(z),$$ where the third equality uses independence.

The "convolution formula" is usually just referring to the distribution of $Z = X+Y$ when $X,Y$ are independent. In this case $$ P_Z(k) = P[Z=k] =\sum_{i} P[X=i, Y=k-i] = \sum_i P[X=i] P[Y=k-i] = (P_X * P_Y)(k). $$
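The sum $P_Z(k) = \sum_i P_X(i)\,P_Y(k-i)$ can be checked numerically. A minimal sketch (the `convolve` helper and the two-dice example are illustrative, not from the answer above):

```python
# Convolution of two pmfs indexed from 0: r[k] = sum_i p[i] * q[k - i].
def convolve(p, q):
    """Return the pmf of Z = X + Y for independent X ~ p, Y ~ q."""
    r = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj
    return r

# Example: two independent fair six-sided dice (index 0 is padded with 0).
die = [0.0] + [1 / 6] * 6        # P(X = k) for k = 0..6
two_dice = convolve(die, die)    # pmf of the sum of the two dice

print(two_dice[7])               # P(X + Y = 7) = 6/36
```

The double loop is exactly the convolution sum: each pair $(i, j)$ with $i + j = k$ contributes $P[X=i]\,P[Y=j]$ to $P[Z=k]$.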


For any sequences $\{a_k\}_{k\ge 0}$ and $\{b_j\}_{j\ge 0}$ we can define the ordinary generating functions

$$G_a(t)=\sum_k a_k t^k,\text{ and }G_b(t)=\sum_j b_j t^j$$

If we now define the sequence $c_h=\sum_{j=0}^{h}a_{h-j}b_j$ for $h\ge 0$, then its generating function is

$$G_c(t)=G_a(t)G_b(t)$$

This is just a Cauchy product. Applying it to the product of the generating functions of the mass functions of independent random variables gives the generating function of the sum.
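The identity $G_c(t) = G_a(t)\,G_b(t)$ can be verified directly: form the Cauchy product of the coefficients, then compare evaluating both sides at a point. A minimal sketch (the sequences `a`, `b` and the helper names are illustrative):

```python
# Cauchy product of two coefficient sequences: c_h = sum_{j=0}^{h} a_{h-j} * b_j.
def cauchy_product(a, b):
    c = [0.0] * (len(a) + len(b) - 1)
    for k, ak in enumerate(a):
        for j, bj in enumerate(b):
            c[k + j] += ak * bj
    return c

def G(coeffs, t):
    """Evaluate the ordinary generating function sum_k coeffs[k] * t^k."""
    return sum(ck * t ** k for k, ck in enumerate(coeffs))

a = [1, 2, 3]
b = [4, 5]
c = cauchy_product(a, b)          # [4, 13, 22, 15]

t = 0.7
print(abs(G(a, t) * G(b, t) - G(c, t)) < 1e-9)   # True: G_c(t) = G_a(t) * G_b(t)
```

With finite sequences this is nothing but polynomial multiplication; for pgfs of independent random variables the coefficients are the probability masses, recovering the accepted answer's result.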