PMF of sum of independent RVs


Let $X$ and $Y$ be independent random variables, having the same probability mass function

$$p_{X}(k) = p_{Y}(k) = p(1-p)^{k-1}, k = 1,2,...$$

  1. Find the probability mass function of $X + Y$.
  2. Compute $E[X + Y]$

My approach to 1:

$$\begin{align}p_{X+Y}(k) = P(X + Y = k) &= P(X = 0, Y = k) + P(X = 1, Y = k-1) + \dots + P(X = k, Y = 0)\\ &= \sum_{i=0}^{k} p_{X}(i)\,p_{Y}(k-i)\\ &= \sum_{i=0}^{k} p(1-p)^{i-1}\,p(1-p)^{k-i-1}\\ &= \sum_{i=0}^{k} p^{2}(1-p)^{k-2}\\ &= kp^{2}(1-p)^{k-2}.\end{align}$$

Does this look correct? This is from an old exam and there is no answer key.

2:

$E[X+Y] = \displaystyle \sum_{k \in \mathbb{N}} kP(X+Y = k) = \sum_{k \in \mathbb{N}} k^{2}p^{2}(1-p)^{k-2}$ is as far as I got. I don't know whether this is right or where to go from here.


Best answer:
  1. Your approach is correct in spirit, but note that the support of $X$'s and $Y$'s distributions is the positive integers, so the support of their sum's distribution is the integers $k \ge 2$. The bounds of your sum must be adjusted accordingly. $$\begin{align}\mathsf P(X+Y=k) ~&=~ \mathbf 1_{k\ge 2}\sum_{j=1}^{k-1} \mathsf P(X=j)\,\mathsf P(Y=k-j)\\[1ex]&=~ \mathbf 1_{k\ge 2}\sum_{j=1}^{k-1}p^2(1-p)^{j-1+k-j-1}\\[1ex]&=~ p^2\,\mathbf 1_{k\ge 2}\sum_{j=1}^{k-1}(1-p)^{k-2}\\[1ex]&=~ (k-1)p^2(1-p)^{k-2}\,\mathbf 1_{k\ge 2}\end{align}$$
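A quick numerical sanity check (my addition, not part of the derivation above): the closed form $(k-1)p^2(1-p)^{k-2}$ should agree with a direct convolution of the two geometric pmfs. The values of `p` and the truncation point `K` below are illustrative choices.

```python
p = 0.3   # illustrative success probability
K = 50    # illustrative truncation point

def p_geom(k):
    """pmf of X (and of Y): p(1-p)^(k-1) for k >= 1, else 0."""
    return p * (1 - p) ** (k - 1) if k >= 1 else 0.0

# Convolution over the correct support j = 1, ..., k-1 vs. the closed form.
for k in range(2, K):
    conv = sum(p_geom(j) * p_geom(k - j) for j in range(1, k))
    closed = (k - 1) * p**2 * (1 - p) ** (k - 2)
    assert abs(conv - closed) < 1e-12
```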

Alternatively: $X$ and $Y$ each count the number of trials up to and including the first success in independent sequences of Bernoulli trials with the same success probability $p$. Their sum is therefore the number of trials up to and including the second success in one such sequence. We may thus evaluate the probability of exactly $1$ success among the first $k-1$ trials, followed by a success on trial $k$, to obtain the same probability mass function.

  2. You could try to find a closed form for the expectation using that pmf, but linearity of expectation is more helpful, provided you know the expectation of a geometric random variable supported on $\{1,2,\dots\}$: $$\mathsf E(X)=1/p$$
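Both routes can be checked numerically (my addition; `p` and the truncation point `K` are illustrative): summing $k\,(k-1)p^2(1-p)^{k-2}$ over $k \ge 2$ should reproduce the linearity answer $\mathsf E(X+Y) = 2\,\mathsf E(X) = 2/p$.

```python
p = 0.3    # illustrative success probability
K = 2000   # illustrative truncation; the geometric tail beyond K is negligible

# E[X+Y] computed directly from the pmf (k-1) p^2 (1-p)^(k-2), k >= 2.
expectation = sum(k * (k - 1) * p**2 * (1 - p) ** (k - 2) for k in range(2, K))

# Linearity of expectation gives 2/p.
assert abs(expectation - 2 / p) < 1e-9
```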
Another answer:

Answer to Question 1: have a look at the answers to this question: How to compute the sum of random variables of geometric distribution

Answer to Question 2: for random variables $X$ and $Y$, $E(X+Y) = E(X)+E(Y)$. This is always true, even when $X$ and $Y$ are not independent, provided the expected values exist; and they do exist when $X$ and $Y$ have a geometric probability mass function (PMF).
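To illustrate that linearity does not need independence, here is a small Monte Carlo sketch (my addition; the sample size, seed, and tolerance are arbitrary choices) using the fully dependent case $Y = X$, for which linearity still gives $E(X+Y) = 2E(X) = 2/p$.

```python
import random

random.seed(0)
p = 0.4       # illustrative success probability
N = 200_000   # illustrative sample size

def geom():
    """Number of Bernoulli(p) trials up to and including the first success."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

xs = [geom() for _ in range(N)]

# Fully dependent case Y = X: the sample mean of X + Y = 2X should still
# be close to 2/p, as linearity of expectation predicts.
mean_sum = sum(2 * x for x in xs) / N
assert abs(mean_sum - 2 / p) < 0.05
```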