For n ∈ N, p ∈ (0, 1) let X ∼ Binom(n, p) and Y ∼ Ber(p) be two independent random variables. a) Determine the values P(X + Y = k) for k ∈ N. What is the distribution of X + Y ? Does this intuitively make sense?
I know the formula and how to do it, e.g., with two Poisson distributions. But I don't get it with a Bernoulli and Binomial distribution.
Does the convolution sum run from $k=0$ to $n$ instead of to infinity? And do I replace $k$ with $m-k$ in the PMF of the binomial random variable?
If $X \sim \mathsf{Binom}(n, p)$ and independently $Y \sim \mathsf{Binom}(m, p),$ then $X+Y \sim \mathsf{Binom}(n+m,p).$ This is 'obvious' by @Jacobiman's comment, and is trivial to prove formally using moment generating functions. [Of the Related links in the margin, this one may be helpful, but it is not an exact duplicate of your question.]
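The MGF argument, written out (a standard derivation, not part of the original answer):

$$M_X(t) = \left(1-p+pe^t\right)^n, \qquad M_Y(t) = \left(1-p+pe^t\right)^m,$$

so by independence

$$M_{X+Y}(t) = M_X(t)\,M_Y(t) = \left(1-p+pe^t\right)^{n+m},$$

which is the MGF of $\mathsf{Binom}(n+m,p),$ and MGFs determine distributions.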
Furthermore, in your case, $Y \sim \mathsf{Ber}(p) \equiv \mathsf{Binom}(1,p),$ by definition. Then $X+Y \sim \mathsf{Binom}(n+1,p).$
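For part (a) you can also compute $P(X+Y=k)$ directly by conditioning on $Y$ (a sketch, using Pascal's rule):

$$P(X+Y=k) = P(X=k)(1-p) + P(X=k-1)\,p = \left[\binom{n}{k} + \binom{n}{k-1}\right] p^k (1-p)^{n+1-k} = \binom{n+1}{k} p^k (1-p)^{n+1-k},$$

for $k = 0, 1, \dots, n+1,$ which is exactly the $\mathsf{Binom}(n+1,p)$ PMF.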
In R, with $n=5$ and $p = 0.4,$ we can illustrate this by simulation. With a million realizations of each random variable, we can expect sample means and variances to approximate population means and variances to about two or three decimal places.
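The original R code is not shown in this excerpt; here is a minimal Python/NumPy sketch of the same simulation (same $n = 5$, $p = 0.4$, one million draws; the seed and variable names are my own choices):

```python
import numpy as np

rng = np.random.default_rng(2024)
n, p, N = 5, 0.4, 10**6

x = rng.binomial(n, p, size=N)  # X ~ Binom(5, 0.4)
y = rng.binomial(1, p, size=N)  # Y ~ Ber(0.4) = Binom(1, 0.4)
w = x + y                       # claim: W = X + Y ~ Binom(6, 0.4)

# Sample moments vs. population moments of Binom(n+1, p)
print(w.mean(), (n + 1) * p)            # both approximately 2.4
print(w.var(), (n + 1) * p * (1 - p))   # both approximately 1.44
```

With $N = 10^6$ the standard error of the sample mean is about $\sqrt{1.44/10^6} \approx 0.0012,$ so agreement to two decimal places is expected.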
Also, a histogram of the simulated distribution of $W = X + Y$ (blue bars) is well matched by the PMF of $\mathsf{Binom}(n+1, p)$ (red dots).