Convolution of random variables - Bernoulli and Binomial


For n ∈ N, p ∈ (0, 1) let X ∼ Binom(n, p) and Y ∼ Ber(p) be two independent random variables. a) Determine the values P(X + Y = k) for k ∈ N. What is the distribution of X + Y ? Does this intuitively make sense?

I know the formula and how to do it, e.g., with two Poisson distributions. But I don't get it with a Bernoulli and Binomial distribution.

Is the sum from $k=0$ to $n$ instead of to infinity? Do I replace $k$ with $m-k$ in the PMF of the binomial random variable?

If $X \sim \mathsf{Binom}(n, p)$ and independently $Y \sim \mathsf{Binom}(m, p),$ then $X+Y \sim \mathsf{Binom}(n+m,p).$ This is 'obvious' by @Jacobiman's comment, and is straightforward to prove formally using moment generating functions.

Furthermore, in your case, $Y \sim \mathsf{Ber}(p) \equiv \mathsf{Binom}(1,p),$ by definition. Then $X+Y \sim \mathsf{Binom}(n+1,p).$
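To answer the question about the summation limits directly: since $Y$ takes only the values $0$ and $1$, the convolution sum has just two terms, and Pascal's rule $\binom{n}{k}+\binom{n}{k-1}=\binom{n+1}{k}$ collapses it into a single binomial PMF (a sketch of the derivation, with the convention $\binom{n}{j}=0$ for $j<0$ or $j>n$):

```latex
\begin{aligned}
P(X+Y=k) &= \sum_{j=0}^{1} P(X = k-j)\,P(Y=j) \\
  &= \binom{n}{k} p^{k}(1-p)^{n-k}\,(1-p) \;+\; \binom{n}{k-1} p^{k-1}(1-p)^{n-k+1}\,p \\
  &= \left[\binom{n}{k} + \binom{n}{k-1}\right] p^{k}(1-p)^{n+1-k} \\
  &= \binom{n+1}{k} p^{k}(1-p)^{n+1-k}, \qquad k = 0, 1, \dots, n+1,
\end{aligned}
```

which is exactly the PMF of $\mathsf{Binom}(n+1,p).$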

In R, with $n=5$ and $p = 0.4$ we can illustrate this by simulation. With a million realizations of each random variable, we can expect sample means and variances to approximate population means and variances to about two or three decimal places.

set.seed(2021)
x = rbinom(10^6, 5, 0.4)
y = rbinom(10^6, 1, 0.4)
w = x + y
mean(x); mean(y); mean(w)
[1] 2.00112    # aprx E(X) = 5(.4) = 2
[1] 0.399865   # aprx E(Y) = 1(.4) = 0.4
[1] 2.400985   # aprx E(W) = 2+0.4 = 2.4 = 6(.4)
var(x); var(y); var(w)
[1] 1.201      # aprx Var(X) = 1.2
[1] 0.2399732  # aprx Var(Y) = 0.24
[1] 1.441141   # aprx Var(W) = 1.44
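Alongside the simulation, the convolution formula can be checked exactly. The following Python sketch (names and structure are my own, not from the answer) computes $P(X+Y=k)$ term by term and compares it with the $\mathsf{Binom}(6, 0.4)$ PMF:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binom(n, p); zero outside the support."""
    if k < 0 or k > n:
        return 0.0
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 5, 0.4  # same parameters as the R simulation above

# Convolution: P(X + Y = k) = P(X = k) P(Y = 0) + P(X = k - 1) P(Y = 1)
conv = [binom_pmf(k, n, p) * (1 - p) + binom_pmf(k - 1, n, p) * p
        for k in range(n + 2)]

# Direct Binom(n + 1, p) PMF for comparison
direct = [binom_pmf(k, n + 1, p) for k in range(n + 2)]

assert all(abs(c - d) < 1e-12 for c, d in zip(conv, direct))
print("convolution matches Binom(6, 0.4) PMF")
```

Both lists agree to floating-point precision, confirming the identity numerically.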

Also a histogram of the simulated distribution of $W$ (blue bars) is well-matched by the PMF of $W$ (red dots).

hist(w, prob=T, br=seq(-.5,6.5), col="skyblue2", main="BINOM(6,.4)")
k = 0:6; pdf = dbinom(k, 6, 0.4)
points(k, pdf, pch=20, col="red")
