Let $X$ and $Y$ be independent random variables, having the same probability mass function
$$p_{X}(k) = p_{Y}(k) = p(1-p)^{k-1}, \qquad k = 1,2,\ldots$$
- Find the probability mass function of $X + Y$.
- Compute $E[X + Y]$
My approach to 1:
$p_{X+Y}(k) = P(X + Y = k) = P(X = 1, Y = k-1) + P(X = 2, Y = k-2) + ... + P(X = k-1, Y = 1) = \sum_{i=1}^{k-1} p_{X}(i)p_{Y}(k-i) = \sum_{i=1}^{k-1} p(1-p)^{i-1}p(1-p)^{k-i-1} = \sum_{i=1}^{k-1} p^{2}(1-p)^{k-2} = (k-1)p^{2}(1-p)^{k-2}$ for $k = 2, 3, \ldots$ (the terms with $i = 0$ and $i = k$ drop out, since $P(X = 0) = P(Y = 0) = 0$).
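As a quick sanity check (not part of the exam solution, and $p = 0.3$ is an arbitrary choice), the convolution sum can be compared numerically against the closed form; since $P(X = 0) = P(Y = 0) = 0$, only the $k-1$ terms $i = 1, \ldots, k-1$ contribute:

```python
# Numerical sanity check of the convolution; p = 0.3 is an arbitrary test value.
p = 0.3

def pmf(k):
    """Geometric pmf on {1, 2, ...}: probability that the first success is on trial k."""
    return p * (1 - p) ** (k - 1) if k >= 1 else 0.0

for k in range(2, 12):
    # Full convolution over i = 0, ..., k; the i = 0 and i = k terms are zero.
    conv = sum(pmf(i) * pmf(k - i) for i in range(0, k + 1))
    closed = (k - 1) * p**2 * (1 - p) ** (k - 2)
    assert abs(conv - closed) < 1e-12
print("convolution matches (k-1) p^2 (1-p)^(k-2)")
```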
Does this look correct? This is from an old exam and there is no answer key.
2:
$E[X+Y] = \displaystyle \sum_{k=2}^{\infty} kP(X+Y = k) = \sum_{k=2}^{\infty} k(k-1)p^{2}(1-p)^{k-2}$ (using the convolution $p_{X+Y}(k) = (k-1)p^{2}(1-p)^{k-2}$ for $k \ge 2$, since $P(X = 0) = P(Y = 0) = 0$) is as far as I got. I suspect linearity of expectation, $E[X+Y] = E[X] + E[Y] = \frac{1}{p} + \frac{1}{p} = \frac{2}{p}$, is the intended shortcut, but I don't know how to sum the series directly.
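For reference, with the pmf $(k-1)p^{2}(1-p)^{k-2}$ the series can be summed directly by differentiating the geometric series twice (writing $q = 1-p$):

```latex
\sum_{k=2}^{\infty} k(k-1)q^{k-2}
= \frac{d^{2}}{dq^{2}} \sum_{k=0}^{\infty} q^{k}
= \frac{d^{2}}{dq^{2}} \frac{1}{1-q}
= \frac{2}{(1-q)^{3}}
= \frac{2}{p^{3}},
\qquad\text{so}\qquad
E[X+Y] = p^{2} \cdot \frac{2}{p^{3}} = \frac{2}{p},
```

which agrees with linearity of expectation, $E[X] + E[Y] = \frac{1}{p} + \frac{1}{p}$.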
Alternatively: $X$ and $Y$ each count the trials until the first success in independent sequences of Bernoulli trials with the same success probability $p$. Their sum is thus the number of trials until the second success in a single such sequence. We may therefore evaluate the probability of exactly one success among the first $k-1$ trials, followed by a success on trial $k$: $\binom{k-1}{1}p(1-p)^{k-2} \cdot p = (k-1)p^{2}(1-p)^{k-2}$, yielding the probability mass function of $X + Y$ directly.
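This negative-binomial reading can also be checked by simulation (an illustrative sketch, again with the arbitrary choice $p = 0.3$): draw the trial index of the second success many times, then compare the empirical mean to $2/p$ and the empirical frequencies to $(k-1)p^{2}(1-p)^{k-2}$:

```python
import random

# Monte Carlo check of the "trials until the second success" interpretation.
random.seed(0)
p, n_sims = 0.3, 200_000  # arbitrary test parameters

def trials_until_second_success():
    """Run Bernoulli(p) trials and return the index of the second success."""
    successes, trial = 0, 0
    while successes < 2:
        trial += 1
        if random.random() < p:
            successes += 1
    return trial

samples = [trials_until_second_success() for _ in range(n_sims)]
mean = sum(samples) / n_sims
print(f"empirical mean {mean:.3f} vs 2/p = {2 / p:.3f}")

# Empirical P(X+Y = k) vs the closed form (k-1) p^2 (1-p)^(k-2):
for k in (2, 3, 5):
    emp = samples.count(k) / n_sims
    theo = (k - 1) * p**2 * (1 - p) ** (k - 2)
    print(f"k={k}: empirical {emp:.4f}, formula {theo:.4f}")
```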