I'm studying for a high school test on probability and came across this problem:
Jessica bought a crate of apples from a farm. The farmer told her that when randomly grabbing an apple from the crate, there's a 0.05 probability that it is rotten.
The question: If Jessica were to randomly grab 5 apples from the crate, what is the probability that exactly 2 would be rotten?
It doesn't say how many apples are in the crate (hence my confusion), so I simply assumed there are $100$ apples, for example. My calculation:
$\dfrac{\binom{5}{2}\binom{95}{3}}{\binom{100}{5}}$, which comes out to about $0.018$. The answer key says the correct answer is $0.021$. Where did I go wrong? Any help would be awesome.
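In case it helps, here's how I checked my number (the crate of $100$ with $5$ rotten apples is my own assumption, not from the problem), using Python's `math.comb`:

```python
from math import comb

# Hypergeometric model (my assumption): crate of N = 100 apples,
# K = 5 of them rotten (5% of 100), draw n = 5, want exactly k = 2 rotten.
N, K, n, k = 100, 5, 5, 2
p_hyper = comb(K, k) * comb(N - K, n - k) / comb(N, n)
print(round(p_hyper, 4))  # 0.0184
```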
I see your logic: you computed the chance of picking $2$ rotten apples and $3$ good ones from a crate of $100$.
This would be correct if there really were exactly $100$ apples, but since the crate size isn't given, the intended model is that each apple is rotten independently with probability $0.05$; your answer converges to that one as the crate gets larger.
Under that model, one particular order would be to pick the $3$ good apples first and then the $2$ bad apples.
That order occurs with probability $(0.95)^3\cdot(0.05)^2$.
However, you have to account for all the orders in which the $3$ good and $2$ bad apples could appear, so multiply by $\displaystyle \binom{5}{2} = 10$, giving $10(0.95)^3(0.05)^2 \approx 0.0214$, which matches the answer key's $0.021$.