(The question is taken from Casella and Berger, *Statistical Inference*, Exercise $5.58$)
Suppose that $U_1,U_2,\dots,U_n$ are iid uniform $(0,1)$ random variables, and let $S_n=\sum_{i=1}^nU_i$. Define the random variable $N$ by $$N=\min\{k:S_k>1\}$$ (a) Show that $P(S_k\leq t)=t^k/k!$ for $0\le t\le 1$.
Given that I know how to show $E(N)=e$,
(b) How large should $n$ be so that you are $95\%$ confident that you have the first four digits of $e$ correct?
We solve (a) with a proof by induction on $k$, for $0\le t\le 1$. The case $k=0$ is trivial for $t>0$ because $S_0=0$ (and since the inductive step below requires $n\ge 1$, one also checks $k=1$ directly: $P(S_1\le t)=t$). Moving from $k=n$ to $k=n+1$ uses a convolution, viz. $$P(S_{n+1}\le t)=\int_0^t\frac{u^{n-1}}{(n-1)!}\,(t-u)\,du=\frac{t^{n+1}}{(n+1)!}.$$ Note that the first factor in the integrand is the PDF, not the CDF, from the inductive hypothesis, while the second factor is $P(U_{n+1}\le t-u)=t-u$, which is valid because $t\le 1$.
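As a sanity check on part (a), here is a quick Monte Carlo sketch (standard library only, not part of the proof) comparing the empirical value of $P(S_k\le t)$ against $t^k/k!$ at $t=0.8$:

```python
import math
import random

random.seed(0)
trials = 200_000
t = 0.8
for k in (1, 2, 3):
    # Empirical frequency of the event {S_k <= t}, S_k a sum of k uniforms
    hits = sum(
        sum(random.random() for _ in range(k)) <= t
        for _ in range(trials)
    )
    empirical = hits / trials
    exact = t**k / math.factorial(k)   # the claimed CDF value t^k / k!
    assert abs(empirical - exact) < 0.01, (k, empirical, exact)
print("P(S_k <= t) matches t^k/k! for k = 1, 2, 3 at t = 0.8")
```

With $200{,}000$ trials the Monte Carlo standard error is on the order of $10^{-3}$, so agreement to within $0.01$ is a comfortable check.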
For (b), note that $N=n$ exactly when $1-U_n<S_{n-1}\le 1$, so conditioning on $U_n$ and applying (a) gives $P(N=n)=\mathbb{E}\big[P(1-U_n< S_{n-1}\le 1\mid U_n)\big]=\mathbb{E}\,\frac{1-(1-U_n)^{n-1}}{(n-1)!}=\frac{n-1}{n!}$ (as a sanity check, the sum over $n\ge 2$ telescopes to $1$). Thus $\mathbb{E}N=\sum_{n\ge 2}\frac{1}{(n-2)!}=e$, as you already knew, while $\mathbb{E}N^2=\sum_{n\ge 2}\frac{n}{(n-2)!}=2+\sum_{n\ge 3}\left(\frac{1}{(n-3)!}+\frac{2}{(n-2)!}\right)=2+e+2(e-1)=3e$, so $\operatorname{Var}N=e(3-e)$. A Normal approximation to the sample mean of iid copies of $N$ is all you need to finish the problem.
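To make the last step concrete, here is a sketch of the finish, under the usual reading that "first four digits correct" means $|\bar N - e| < 0.0005$ (an assumption; a stricter reading would change the constant): the normal approximation gives $1.96\sqrt{\operatorname{Var}(N)/n}\le 0.0005$. The snippet also simulates $N$ to sanity-check $\mathbb{E}N=e$ and $\operatorname{Var}N=e(3-e)$:

```python
import math
import random

# Sample size from 1.96 * sqrt(Var(N)/n) <= 0.0005, with Var(N) = e(3 - e)
var_N = math.e * (3 - math.e)              # approximately 0.766
n = math.ceil((1.96 / 5e-4) ** 2 * var_N)  # roughly 1.2e7 copies of N

# Simulate N = min{k : S_k > 1} to sanity-check the moment computations
random.seed(1)
def draw_N():
    s, k = 0.0, 0
    while s <= 1.0:
        s += random.random()
        k += 1
    return k

samples = [draw_N() for _ in range(200_000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
print(f"n >= {n}, E(N) about {mean:.3f}, Var(N) about {var:.3f}")
```

The required $n$ comes out on the order of $10^7$, and the simulated mean and variance should land near $e\approx 2.718$ and $e(3-e)\approx 0.766$ respectively.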