Probability mass function, Bernoulli RVs


The question:

1) Verify whether this is True or False: $$ \sum_{k=0}^n \left( \frac{k}{n} -p \right) {n \choose k} p^k(1-p)^{n-k} = 0 $$

2) (This part is not related to Q1.) The question: if $X_1,X_2,\dots,X_n$ are Bernoulli($p$) RVs, then the sum of all of them is a Binomial($n,p$) RV. Is this statement True or False?

My attempt:

1) I don't know how to handle the factor $$\frac{k}{n} -p$$ inside the sum. Without that factor, the remaining sum is just the pmf of the Binomial distribution summed over all $k$, so it equals $1$ (I am not sure whether I can set it to $1$ right away or have to manipulate it further somehow). My guess is that the whole expression somehow cannot equal $0$ (just guessing). But since I am stuck on the $\frac{k}{n}$ term, I still can't conclude anything about the whole sum.

2) I know it looks true at first, but it is actually false because the word "independent" is missing. My question is: why are the words "independent Bernoulli RVs" so important here?

PS: I am new to the forum and not sure what the green tick means. I appreciate everyone who helps me, but since I cannot tick every answer, may I just upvote everyone instead? Is that OK?

Any help would be appreciated. Thank you :)


There are 3 answers below.

---

Let $X$ be a random variable having binomial distribution with parameters $n$ and $p$.

Then:

$$\sum_{k=0}^{n}\left(\frac{k}{n}-p\right)\binom{n}{k}p^{k}\left(1-p\right)^{n-k}=\frac{1}{n}\sum_{k=0}^{n}k\,P\left(X=k\right)-p\sum_{k=0}^{n}P\left(X=k\right)=\frac{1}{n}\mathsf{E}X-p=\frac{1}{n}np-p=0.$$
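This derivation is easy to sanity-check numerically. A minimal Python sketch (the values of $n$ and $p$ below are arbitrary choices, not from the original):

```python
from math import comb

def lhs(n, p):
    """Compute sum_{k=0}^n (k/n - p) * C(n,k) * p^k * (1-p)^(n-k)."""
    return sum((k / n - p) * comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

# The sum should vanish (up to floating-point error) for any n >= 1, 0 <= p <= 1.
for n, p in [(5, 0.2), (10, 0.5), (30, 0.75)]:
    print(n, p, lhs(n, p))
```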

2) These words are important because they ensure that $X:=\sum_{i=1}^nX_i$ has a binomial distribution with parameters $n$ and $p$. If independence is missing, then for instance it is not excluded that we have $X_1=X_2=\cdots=X_n$, leading to a random variable $X=\sum_{i=1}^nX_i$ that only takes values in $\{0,n\}$, hence (for $n>1$) evidently not binomial.
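This failure mode shows up immediately in a simulation. A small sketch (parameter values are arbitrary): draw one Bernoulli($p$) variable per trial and copy it $n$ times, so each $X_i$ is Bernoulli($p$) but the $X_i$ are perfectly dependent.

```python
import random

random.seed(0)
n, p, trials = 5, 0.5, 10_000

# Each trial: one coin flip copied n times, so X_1 = ... = X_n.
# The sum is then n * X_1, which only takes the values 0 and n.
sums = [n * (random.random() < p) for _ in range(trials)]

print(sorted(set(sums)))  # support of the sum: only {0, n}
```

A genuine Binomial($5, 0.5$) variable would also put mass on $1,2,3,4$, so the dependent sum cannot be binomial.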

---

I'll just answer the first question for now:

Using the identity $\frac{k}{n}\binom{n}{k}=\binom{n-1}{k-1}$ and then substituting $j=k-1$ and $m=n-1$, we have

$$ \sum_{k=0}^n \frac{k}{n} {n \choose k} p^k(1-p)^{n-k} = \sum_{k=1}^n {n-1 \choose k-1} p^k(1-p)^{n-k} = \sum_{j=0}^m {m \choose j} p^{j+1}(1-p)^{m-j} \\= p\sum_{j=0}^m {m \choose j} p^{j}(1-p)^{m-j} = p,$$
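The index shift can be verified numerically; a quick Python sketch with arbitrarily chosen $n$ and $p$:

```python
from math import comb

n, p = 12, 0.4
m = n - 1

# Left side: sum_k (k/n) * C(n,k) * p^k * (1-p)^(n-k)
left = sum((k / n) * comb(n, k) * p**k * (1 - p)**(n - k)
           for k in range(n + 1))

# After the shift j = k-1: p times the full Binomial(m, p) pmf, which sums to 1.
right = p * sum(comb(m, j) * p**j * (1 - p)**(m - j) for j in range(m + 1))

print(left, right)  # both should be (approximately) p
```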

so that indeed

$$\sum_{k=0}^n \left( \frac{k}{n} -p\right) {n \choose k} p^k(1-p)^{n-k} \\= \sum_{k=0}^n \frac{k}{n} {n \choose k} p^k(1-p)^{n-k} - p\sum_{k=0}^n {n \choose k} p^k(1-p)^{n-k} = p-p= 0.$$

---

> The question is if $X_1,X_2,\ldots,X_n$ are Bernoulli($p$) RVs, then the sum of all of them is a Binomial($n,p$) RV. Is this statement True or False?

That is true if $X_1,\ldots,X_n$ are independent. How best to prove it might depend on where you are in learning about these things.

If you know that $$\sum_{k=0}^n k \binom n k p^k (1-p)^{n-k} = np\tag 1$$ then that is most of what you need to answer your first question. As for how $(1)$ is proved: some algebra can do it, and there are also probabilistic arguments. But again, I don't want to go into any of that without knowing where you are in the process of learning this.
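Identity $(1)$ is also easy to confirm by direct computation. A minimal Python sketch (parameter values are arbitrary):

```python
from math import comb

def binomial_mean(n, p):
    """Compute sum_{k=0}^n k * C(n,k) * p^k * (1-p)^(n-k) term by term."""
    return sum(k * comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

# Each pair of printed values should agree, matching E[X] = n*p.
for n, p in [(4, 0.5), (10, 0.3), (20, 0.9)]:
    print(binomial_mean(n, p), n * p)
```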