Applying the basic formula for binomial distribution


I'm pretty confused about how this works. In my class, my teacher states:

Let $X$ be a random variable with $S_X = \{0,1\}$. $X$ follows a Bernoulli distribution if $P(X = x) = p^x(1-p)^{1-x}$ for some $p$ in $(0,1)$

Suppose a student randomly guesses on four multiple choice questions with five possible choices. Let $X_i = 1$ if the $i^{th}$ question is correctly answered. Let $Y = \sum_{i=1}^4 X_i$ , the number of correct guesses out of four.

By letting $p = 0.2$, the probability distribution of $Y$ follows:

$P(Y = 0) = 1 \times p^0 \times (1 - p)^4 = 0.4096$

... etc.

My question is, how is the formula above being used? Somehow $p^x$ becomes $p^0$, which makes sense, but if $x = 0$ in this case, how does the exponent $1 - x = 1 - 0$ become $4$?


Accepted answer:

$Y=0$ if and only if $X_1=X_2=X_3=X_4=0$. Since the four random variables are independent, the probability this happens is $$\Pr[X_1=0]\cdot\Pr[X_2=0]\cdot\Pr[X_3=0]\cdot\Pr[X_4=0]$$ which is equal to $$ (p^0(1-p)^{1-0})\cdot(p^0(1-p)^{1-0})\cdot(p^0(1-p)^{1-0})\cdot(p^0(1-p)^{1-0}) = p^0(1-p)^{4} $$
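This product can be checked numerically. The following quick Python sketch (not part of the original answer) multiplies the four Bernoulli pmf values $p^0(1-p)^{1-0}$ together for $p = 0.2$:

```python
p = 0.2

# Each factor is the Bernoulli pmf at x = 0:  p**0 * (1 - p)**(1 - 0) = 1 - p.
prob_all_wrong = 1.0
for _ in range(4):
    prob_all_wrong *= p**0 * (1 - p)**(1 - 0)

print(round(prob_all_wrong, 4))  # -> 0.4096
```

The exponent $4$ appears only because four factors of $(1-p)^1$ are multiplied, matching the question's $0.4096$.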

Another answer:

The sum of $n$ independent Bernoulli random variables, all with the same $p$, is a binomial random variable with parameters $n$ and $p$.
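One way to see this claim in action is a simulation (my own illustration, with an arbitrary seed and trial count, not part of the answer): draw $n$ independent Bernoulli($p$) values many times, sum them, and compare the empirical frequencies with the binomial pmf.

```python
import math
import random

n, p, trials = 4, 0.2, 100_000
random.seed(42)  # any seed works; fixed here for reproducibility

counts = [0] * (n + 1)
for _ in range(trials):
    # Sum of n independent Bernoulli(p) draws (True counts as 1).
    y = sum(random.random() < p for _ in range(n))
    counts[y] += 1

for k in range(n + 1):
    exact = math.comb(n, k) * p**k * (1 - p)**(n - k)
    print(k, counts[k] / trials, round(exact, 4))
```

The empirical frequencies should land close to the exact binomial probabilities listed below.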

For the case $n=4$, here are a few of the probabilities in the distribution:

$P(Y = 0) = (1-p)^4,$ as you already know.

$P(Y = 1) = 4p(1-p)^3,$ as in the hint from @Alex. You must take into account all of the sequences 1000, 0100, 0010, and 0001, for values of the $X_i.$

$P(Y = 2) = 6p^2(1-p)^2,$ because there are six possible arrangements of two 0s and two 1s. And so on for $P(Y = 3)$ and $P(Y = 4).$

Notes: (1) You should look up (or read ahead in your book to) the binomial distribution. (2) Looking beyond this problem, moment generating functions are one convenient way to show that a sum of independent Bernoullis with the same $p$ is binomial. (3) Of course, all five of these probabilities for $n = 4$ must sum to 1. To see this, consider the "binomial expansion" of $(p + q)^n$, where $0 \le p \le 1$ and $q = 1-p$.
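The five probabilities and the sum-to-1 check in note (3) are easy to verify. A short Python sketch (my addition, not from the answer), using the standard-library `math.comb` for the binomial coefficients:

```python
import math

n, p = 4, 0.2
q = 1 - p  # probability of a wrong guess

# The binomial pmf for k = 0, 1, ..., n: these are exactly the
# terms of the binomial expansion of (p + q)^n.
probs = [math.comb(n, k) * p**k * q**(n - k) for k in range(n + 1)]

for k, pr in enumerate(probs):
    print(f"P(Y = {k}) = {pr:.4f}")

print("total:", round(sum(probs), 10))  # -> 1.0
```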

A third answer:

In my class my teacher states that:

Let $X$ be a random variable with $S_X=\{0,1\}$. $X$ follows a Bernoulli distribution if $P(X=x)=p^x(1−p)^{1−x}$ for some $p$ in $(0,1)$.

Doubtful. Your teacher should have stated:

Let $X$ be a random variable with support $S_X=\{0, 1\}$.   Then $X$ has a Bernoulli distribution if $\mathsf P(X=1) = p$ and $\mathsf P(X=0)=1-p$ for some $p$ in the interval $(0,1)$.

Let $\{X_i\}_{i=1}^n$ be a sequence of $n$ independent and identically distributed Bernoulli random variables, each with parameter $p$, and let $Y=\sum_{i=1}^n X_i$, the sum of the results of that sequence.   Then $Y$ is a random variable with support $S_Y=\{0, 1, \dots, n\}$ and a binomial distribution: $$\mathsf P(Y = k) = \binom{n}{k} p^k(1-p)^{n-k}$$

The origin of this formula is that the $n$ random variables $X_i$ can represent a sequence of $n$ independent trials, each of which results in a failure or a success (outcome $0$ or $1$ respectively). $Y$ then represents the count of successes.

  • The probability of $k$ successes in a row is: $p^k$
  • The probability of $n-k$ failures in a row is: $(1-p)^{n-k}$
  • The count of distinct arrangements of $k$ successes and $n-k$ failures is $\binom{n}{k}$

Hence the probability that these $n$ trials produce some arrangement of $k$ successes and $n-k$ failures is $\binom{n}{k} p^k (1-p)^{n-k}$.
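The three-bullet argument above can be confirmed by brute force. This Python sketch (my own check, not part of the answer) enumerates every $0/1$ sequence of length $n$, adds up the probability of each sequence with exactly $k$ ones, and compares against $\binom{n}{k}p^k(1-p)^{n-k}$:

```python
import itertools
import math

n, p = 4, 0.2

for k in range(n + 1):
    # Each individual sequence with k ones has probability p^k (1-p)^(n-k);
    # summing over all such sequences counts the arrangements.
    brute = sum(
        p**k * (1 - p)**(n - k)
        for seq in itertools.product([0, 1], repeat=n)
        if sum(seq) == k
    )
    formula = math.comb(n, k) * p**k * (1 - p)**(n - k)
    print(k, round(brute, 4), round(formula, 4))
```

The two columns agree for every $k$, which is exactly the statement that the $\binom{n}{k}$ arrangements each contribute the same probability $p^k(1-p)^{n-k}$.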