Let $\mathcal{U}_{1} , \mathcal{U}_{2}, \ldots $ be a sequence of independent, uniform random variables on $(0,1)$. Find $E[N]$ when $N = \min{\left\{n: \sum_{i=1}^{n} \mathcal{U}_{i} > 1 \right\}} $.
Let $N$ be the number of uniform random variables that need to be summed to exceed $1$.
$$ N(x) = \min \left\{n : \sum_{i=1}^{n} \mathcal{U}_{i} > x \right\}$$ Let $m(x) = E[N(x)] = E\big[\,E[N(x)\mid \mathcal{U}_{1}]\,\big] = \int_{0}^{1} E[N(x) \mid \mathcal{U}_{1} = y] \, dy $, where $$ E[N(x)\mid \mathcal{U}_{1} = y] = \begin{cases}1 & \text{if $y>x$}\\ 1 + m(x-y) & \text{if $y \leq x$}\end{cases}$$
I have a conceptual problem understanding the motivation behind the expression for the second case above. Would someone be willing to shed some light?
$$E[N(x)\mid \mathcal{U}_{1} = y]=\begin{cases}1 & \text{if $y>x$}\\1 + m(x-y) & \text{if $y \leq x$}\end{cases} = 1 + \begin{cases}0 & \text{if $y>x$}\\ m(x-y) & \text{if $y \leq x$}\\\end{cases}$$ and so $$ m(x) = 1 + \int_{0}^{x} m(x-y) \ dy. $$ Differentiating both sides with respect to $x$ (Leibniz rule) gives $$ m'(x) = m(0) + \int_{0}^{x} m'(x-y) \ dy. $$ We perform a change of variables with $u = x - y$ and $du = -dy$:
$$ \int_{0}^{x} m'(x-y) \ dy = \int_{0}^{x} m'(u) \ du = m(x) - m(0), $$ so $$ m'(x) = m(0) + \big[m(x) - m(0)\big] = m(x) \implies m(x) = C_{0} e^{x}. $$ Since $N(0) = 1$ almost surely (the first uniform is positive, so it already exceeds $0$), the initial value is $m(0) = 1$, which forces $C_{0} = 1$. Hence $m(x) = e^{x}$ and $$ m(1) = e. $$
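As a numerical sanity check on $m(x) = e^x$ (not part of the derivation itself; the helper names below are my own), a short Monte Carlo simulation of $N(x)$:

```python
import random

def sample_N(x: float) -> int:
    """Draw uniforms on (0,1) until their running sum exceeds x; return the count."""
    total, n = 0.0, 0
    while total <= x:
        total += random.random()
        n += 1
    return n

def estimate_m(x: float, trials: int = 200_000) -> float:
    """Monte Carlo estimate of m(x) = E[N(x)]."""
    return sum(sample_N(x) for _ in range(trials)) / trials

random.seed(0)
print(estimate_m(1.0))  # should be close to e ≈ 2.71828
print(estimate_m(0.5))  # should be close to e^{1/2} ≈ 1.64872
```

With $2 \times 10^5$ trials the estimates typically agree with $e^x$ to two decimal places, since $\operatorname{Var}(N(1)) = 3e - e^2 \approx 0.77$ is small.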
This result seems pretty neat since the $e$ comes in without any mention of the constant anywhere in the question. However, I can't appreciate it without understanding the origin of the $1 + m(x-y)$ term.
If $\mathcal U_1=y> x$ then you are already past the target, so you needed to "sum up" only $1$ random variable: $$E[N(x) \mid \mathcal U_1=y>x]=1$$ Now, if $\mathcal U_1=y\le x$ then you have "summed up" $1$ random variable and are still $x-y$ short of the target. The remaining uniforms are independent of $\mathcal U_1$, so the number of further variables needed to cover $x-y$ has the same distribution as $N(x-y)$, whose mean is $m(x-y)$. Hence the expected number of r.v.'s to reach $x$ is $1+$ the expected number of r.v.'s to reach the remaining $x-y$. In your notation $1+m(x-y)$, or formally $$E[N(x)\mid \mathcal U_1=y\le x]=1+m(x-y)$$
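You can also check the conditioning step directly by simulation: force the first "draw" to a fixed $y \le x$, count how many variables are needed in total, and compare the average against $1 + m(x-y) = 1 + e^{x-y}$. (A sketch under the result $m(x)=e^x$; the helper name is mine.)

```python
import math
import random

def N_given_first(x: float, y: float) -> int:
    """Simulate N(x) conditional on U_1 = y: the forced first draw counts as 1,
    then uniforms are added until the running sum exceeds x."""
    total, n = y, 1
    while total <= x:
        total += random.random()
        n += 1
    return n

random.seed(1)
x, y = 1.0, 0.4
trials = 200_000
est = sum(N_given_first(x, y) for _ in range(trials)) / trials
# Compare with 1 + m(x - y) = 1 + e^{x-y} ≈ 2.822 for x=1, y=0.4
print(est, 1 + math.exp(x - y))
```

The two printed values should agree to about two decimal places, which is exactly the statement $E[N(x)\mid \mathcal U_1 = y \le x] = 1 + m(x-y)$.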
I find it a neat result too, by the way.