Suppose a baker mixes 5000 raisins into a large quantity of dough, mixes it well, and bakes 1000 equally sized cookies from it.
So, I'm going to define a random variable $X$ as the number of raisins in a randomly selected cookie. But I'm not sure whether I should use a normal or a uniform distribution. $X=0$ would be a rare occurrence, so a uniform distribution doesn't seem to make sense. And if I try a normal distribution, how would I calculate the standard deviation when it isn't given?
Edit: OK, I left out the other part of the problem. I am supposed to find the approximate probability that exactly ten out of 20 randomly selected cookies each contain at least four raisins.
With this additional information, would it make sense to use a hypergeometric distribution instead? $X$ would then be redefined as the number of cookies, out of the 20 randomly selected, that each contain at least four raisins. Then $P(X=10)$ would be what I want, but that still leaves open the question of how to get the probability that a single cookie contains at least four raisins.
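To check my intuition numerically, here is a small simulation sketch. It assumes (my assumption, not stated in the problem) that the "mixed well" condition means each raisin independently lands in a uniformly random cookie; it estimates the chance that a single cookie has at least four raisins, compares it with the Poisson value for rate $\lambda = 5000/1000 = 5$, and then plugs that probability into a binomial count for the 20-cookie question.

```python
import math
import random

random.seed(42)
N_RAISINS, N_COOKIES = 5000, 1000

def bake():
    """Drop each raisin independently into a uniformly random cookie
    (my modeling assumption for 'mixed well')."""
    counts = [0] * N_COOKIES
    for _ in range(N_RAISINS):
        counts[random.randrange(N_COOKIES)] += 1
    return counts

# Monte Carlo estimate of p = P(a single cookie has >= 4 raisins)
batches = 100
at_least_4 = sum(c >= 4 for _ in range(batches) for c in bake())
p_hat = at_least_4 / (batches * N_COOKIES)

# Poisson(lambda = 5) value of the same probability, for comparison
lam = N_RAISINS / N_COOKIES
p_pois = 1 - sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(4))

# Treat the 20 sampled cookies as independent trials:
# P(exactly 10 of 20 cookies have >= 4 raisins)
answer = math.comb(20, 10) * p_pois**10 * (1 - p_pois)**10

print(round(p_hat, 3), round(p_pois, 3), round(answer, 4))
```

The simulated `p_hat` lands close to the Poisson value of about 0.735, which suggests the per-cookie count is approximately Poisson rather than normal or uniform, and that the 20-cookie part can be handled as a binomial with that success probability.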