Probability: Expected number of randoms from $0$ to $1$ needed to exceed $1$


I was taking a calculus competitive test, and I encountered this problem:

Henrik is randomly choosing numbers between $0$ and $1$ until the sum of all of the numbers that he has chosen exceeds $1$. What is the expected number of numbers Henrik will choose?

I spent a good amount of time on this, but I can't even think of a way to approach the problem.

I'm currently taking AP Calculus BC, so I know a good amount of calculus, but I feel like I'm missing the intuition to tackle problems like this where calculus is applied to probability.

Any help would be appreciated.

1 Answer

I'm not sure how the question was intended to be answered, but I suspect they wanted you to use a formula for the expected sum.

From probability, the expected sum of $n$ draws is $n$ (the number of draws) times the expected (average) value of a single draw. This is where calculus comes in. If you are drawing numbers uniformly from $0$ to $1$, the density is $1$ everywhere on $[0,1]$, so the expected value of one draw is the integral of $x$ times that density:
$$\int_0^1 x \cdot 1 \, dx = \frac{1}{2}.$$
Then take that answer and determine the smallest $n$ for which $n$ times the average draw exceeds $1$.
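As a sanity check on any approximation like this, the process can be simulated directly. Below is a short Monte Carlo sketch (my own illustration, not part of the original answer; the function name is made up). For this classic problem the simulated mean comes out near $e \approx 2.718$, which is the known closed-form value of the expected count.

```python
import random

def draws_to_exceed_one():
    """Count uniform(0,1) draws until the running sum exceeds 1."""
    total, count = 0.0, 0
    while total <= 1.0:
        total += random.random()
        count += 1
    return count

random.seed(0)
trials = 200_000
avg = sum(draws_to_exceed_one() for _ in range(trials)) / trials
print(avg)  # close to e ≈ 2.718
```

With enough trials the estimate settles near $2.718$, so a simulation is a quick way to check whatever expression the calculus argument produces.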