Let $a < b < c$ be randomly chosen numbers such that $b > 0$.
Find an approximation for $k$, the number of times one must choose a random number uniformly in the interval $(a, b)$ so that the sum of the chosen numbers exceeds $c$.
I don't even know where to start. I wrote a simulation script in R, but I don't know how to turn it into a mathematical approximation.
a <- 0; b <- 1; c <- 10   # example values satisfying 0 <= a < b < c
tries <- 10^4
results <- numeric(tries)
for (i in 1:tries) {
  n <- runif(1, a, b)
  k <- 1                  # the first draw counts toward k
  while (n < c) {
    n <- n + runif(1, a, b)
    k <- k + 1
  }
  results[i] <- k
}
print(mean(results))
I will assume $a \ge 0$.
The average of each selection is $\frac{a+b}{2}$, so if every draw equaled this average, it would take about $\frac{2c}{a+b}$ draws for the sum to exceed $c$.
If you vary the choices of $0 \le a < b < c$ in the script above, you will find that the bulk of the distribution of $k$ is near $\frac{2c}{a+b}$, and that the mean of $k$ is also near $\frac{2c}{a+b}$.
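To check this heuristic numerically, here is a minimal Python analogue of the R simulation above. The values $a = 0$, $b = 1$, $c = 10$ are assumed only for illustration; with them the heuristic predicts $\frac{2c}{a+b} = 20$ draws on average.

```python
import random

def draws_to_exceed(a, b, c):
    """Count how many Uniform(a, b) draws are needed for the sum to exceed c."""
    total, k = 0.0, 0
    while total <= c:
        total += random.uniform(a, b)
        k += 1
    return k

# Assumed example parameters: 0 <= a < b < c.
a, b, c = 0.0, 1.0, 10.0
trials = 10_000
mean_k = sum(draws_to_exceed(a, b, c) for _ in range(trials)) / trials
print(mean_k, 2 * c / (a + b))
```

Running this, the simulated mean lands close to the $\frac{2c}{a+b}$ prediction (slightly above it, since the final draw overshoots $c$ on average).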