How can I see this:
$$\sum_{n=k}^\infty {n \choose k} p^{n-k} (1-p)^{k+1}=1$$
What probability distribution can I connect the above with? The negative binomial?
The way I looked at this was as follows:
First, recall the binomial distribution: if we repeat a Bernoulli experiment (e.g. a coin toss) with success probability $p$ and failure probability $q=1-p$ a total of $n$ times, the probability of having $k$ successes in those $n$ tries is given by the probability mass function \begin{align} \mathbb{P}(X=k) = \binom{n}{k} p^k(1-p)^{n-k}. \end{align} Now notice that since $q=1-p$, it is equally true that $p=1-q$, so we can also write \begin{align} \mathbb{P}(X=k) = \binom{n}{k} q^k(1-q)^{n-k}, \end{align} where $\mathbb{P}(X=k)$ is now the probability that we try $n$ times and fail $k$ times. Using $q=1-p$ once more, this becomes \begin{align} \mathbb{P}(X=k) = \binom{n}{k} (1-p)^k p^{n-k}. \end{align} This gives us some insight into the sum you presented, but we're not quite there yet. Your summand is \begin{align} \binom{n}{k}(1-q)^{n-k}q^{k+1} = q\binom{n}{k}(1-q)^{n-k}q^k. \end{align} I've rewritten it in terms of $q$ because that makes the expression a little clearer to dissect, and I've factored out one power of $q$: the summand now reads $q$ (the probability of failing the experiment) times $\mathbb{P}(X=k)$ (the probability of failing $k$ times in $n$ tries). So your summand gives the probability that we try $n$ times, fail $k$ times in those $n$ tries, and then fail again on an $(n+1)$-th try. In other words: the summand is the probability that, in $n+1$ tries, the $(k+1)$-th failure occurs exactly on the $(n+1)$-th (last) try.
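To build some confidence in this algebra, here is a quick numerical sketch (the values of $p$ and $k$ are arbitrary illustrative choices, not part of the question): it checks that each summand factors as $q$ times a binomial pmf, and that the partial sums of the series approach $1$.

```python
from math import comb

# Illustrative parameter choices (any 0 < p < 1 and integer k >= 0 work).
p, k = 0.6, 2
q = 1 - p

# Each summand factors as q times the Binomial(n, q) pmf evaluated at k.
n = 7
summand = comb(n, k) * p**(n - k) * q**(k + 1)
binom_pmf_at_k = comb(n, k) * q**k * (1 - q)**(n - k)
print(abs(summand - q * binom_pmf_at_k) < 1e-12)   # True

# The partial sums approach 1 (truncated at n = 499, where the
# remaining tail of the series is negligible).
partial = sum(comb(n, k) * p**(n - k) * q**(k + 1) for n in range(k, 500))
print(round(partial, 10))   # 1.0
```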
Now let's sum this term over $n$. The terms with $n < k$ can be omitted because they are meaningless (they would ask: what is the probability of failing $k$ times in fewer than $k$ tries? which is of course impossible), so we start the sum from $n=k$: \begin{align} \sum_{n=k}^{\infty} q\binom{n}{k}q^k(1-q)^{n-k} \end{align} Before summing, think about what each term says: we fix $k$, the number of failures in the first $n$ tries, and consider the probability that we then also fail one additional time in an extra try. For instance, fix $k=2$ and ask: what is the probability that a third failure occurs on the $(n+1)$-th try? Now treat every $n$ as a separate case:

- $n=2$: the third failure occurs on try $3$ (all three tries fail);
- $n=3$: the third failure occurs on try $4$;
- $n=4$: the third failure occurs on try $5$;
and so on. Now take $n$ to infinity: what is the probability that one of these cases occurs? It is $1$, because if we toss the coin infinitely many times, there must be some try on which we fail for the third time. In other words: we toss a coin a few times and have just failed for the second time; we now keep tossing until we eventually fail a third time, which is certain to happen if we just keep trying long enough, so the probability of this occurring is $1$. (Also notice that all tosses after the third failure are irrelevant: sure, we can keep tossing the coin, but we are never going to fail *for the third time* again.)
So you can now see that if we sum over all these cases, the sum is $1$: the cases are mutually exclusive and exactly one of them must occur (in some case we fail for the third time), so their probabilities must add up to $1$.
If we now generalise to arbitrary $k$: what is the probability that the $(k+1)$-th failure occurs on some $(n+1)$-th try, where $n$ is some number between $k$ and infinity? Again: it's going to be $1$.
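None of this is needed for the argument, but it can be sanity-checked with a short Monte Carlo simulation (parameter values below are illustrative choices): keep tossing until the $(k+1)$-th failure occurs. Every run terminates, matching the "probability 1" claim, and the empirical distribution of the toss on which it happens tracks the summand.

```python
import random
from math import comb

random.seed(0)
p, k, runs = 0.6, 2, 100_000   # illustrative; q = 1 - p is the failure chance
q = 1 - p

def toss_until_final_failure():
    """Toss until the (k+1)-th failure; return the toss on which it lands."""
    failures, toss = 0, 0
    while failures < k + 1:
        toss += 1
        if random.random() < q:   # this toss is a failure
            failures += 1
    return toss

counts = {}
for _ in range(runs):
    t = toss_until_final_failure()
    counts[t] = counts.get(t, 0) + 1

# Every run terminated, so the empirical probabilities sum to 1 ...
print(sum(counts.values()) == runs)   # True

# ... and the chance of stopping on toss n+1 tracks the summand.
for n in range(k, k + 4):
    theory = comb(n, k) * p**(n - k) * q**(k + 1)
    empirical = counts.get(n + 1, 0) / runs
    print(n + 1, round(theory, 4), round(empirical, 4))
```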
As for which distribution this is: it is the negative binomial distribution, as you suspected. It describes the number of non-failures before the $(k+1)$-th failure, and it does indeed combine a binomial flavour (counting failures among $n$ tries) with a geometric one (waiting for the next failure). Also, if you're more into just brute-force algebra, have a look at this question: Infinite binomial sum.
Hope this helps!