I have the sequence $(y_n)_{n\in\mathbb N_0}$ of functions $$y_n\colon [0,\alpha] \to \mathbb R$$ defined recursively by $$ y_{n+1}(x) = \int_0^x g\bigl(y_{n}(\xi)\bigr)\,d\xi,\qquad n\in\mathbb N_0, $$ starting from the constant function $y_0(x)\equiv 0$. We know nothing about the function $g$, except that all of these integrals exist.
Suppose we know that $(y_n)_{n\in\mathbb N}$ converges uniformly to a function $y^*$: $$\|y_n - y^*\|\to0,\qquad(n\to\infty)$$ where $\|\bullet\|$ is the supremum norm on $[0,\alpha]$: $$ \| y \| = \sup_{x\in [0,\alpha]} |y(x)|. $$
The question is:
Can we prove that $y^*$ has to be differentiable?
If I knew that each $y_n$ was differentiable with $y_n'(x) = g\bigl(y_{n-1}(x)\bigr)$, it would suffice to show that $(y_n')_{n\in\mathbb N}$ converges uniformly. But I don't know anything about the convergence of $(y_n')$, since I don't know anything about $g$. Would it be easier, or at least doable, if $g$ were continuous?
This question arises when you try to apply Picard iteration to the initial value problem $$ y'(x) = g\bigl(y(x)\bigr),\qquad y(0) = 0, $$ where one does not have Lipschitz continuity of $g$. There is a similar exercise in the German standard textbook „Gewöhnliche Differentialgleichungen“ by Harro Heuser: Exercise III.12.5 (at least in the fourth edition).
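For intuition, the iteration can be carried out numerically. This is a minimal sketch, assuming the illustrative choice $g(y)=\cos(y)$ (which, unlike the general setting of the question, is smooth and Lipschitz); the exact solution of the IVP is then the Gudermannian function $y(x)=\arcsin(\tanh x)$:

```python
import numpy as np

# Picard iteration y_{n+1}(x) = ∫_0^x g(y_n(ξ)) dξ on a grid over [0, α].
# g(y) = cos(y) is only an illustrative choice; the question assumes no
# such regularity for g.
alpha = 1.0
xs = np.linspace(0.0, alpha, 1001)
h = xs[1] - xs[0]
g = np.cos

y = np.zeros_like(xs)               # y_0 ≡ 0
for _ in range(20):
    vals = g(y)
    # cumulative trapezoidal rule approximates x ↦ ∫_0^x g(y(ξ)) dξ
    y = np.concatenate(([0.0], np.cumsum((vals[1:] + vals[:-1]) / 2) * h))

# For g = cos, the exact solution of y' = cos(y), y(0) = 0 is
# y(x) = arcsin(tanh(x)); the iterates approach it in the sup norm.
print(np.max(np.abs(y - np.arcsin(np.tanh(xs)))))
```

Successive iterates contract toward the fixed point of the integral operator, so the printed sup-norm error is tiny (only discretization error remains); of course this demonstrates nothing for a general measurable $g$.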
Assume $g\in L^\infty(\mathbb{R})$. Since $(y_n)$ converges in the sup norm, it is a Cauchy sequence in $\mathcal{C}([0,\alpha])$. Thus:
$\lVert \int_0^x g(y_n(t))dt-y_n(x)\rVert\to 0$,
as $n\to \infty$. From this:
$\lVert y^* -\int_0^x g(y^*(t))dt\rVert\leq\lVert y^* -y_n\rVert+\lVert y_n -\int_0^x g(y_n(t))dt\rVert+\lVert \int_0^x g(y_n(t))dt -\int_0^x g(y^*(t))dt\rVert$.
The first and second terms go to zero, by hypothesis and by what we recalled above, respectively. The third term goes to zero by the dominated convergence theorem: the trick is that you are integrating over a bounded interval, and independently of what you put inside $g$ you can bound it from above by the integrable constant function $\lVert g\rVert_\infty$. (For the pointwise convergence $g(y_n(t))\to g(y^*(t))$ of the integrands one needs some regularity, e.g. continuity of $g$.)
Therefore $y^*(x) = \int_0^x g(y^*(t))\,dt$, so $y^*$ is absolutely continuous and hence differentiable almost everywhere.
From my point of view $g$ should be at least $L^1$, so that the integrals make sense. For $g\in L^1$ it is much harder, and probably false, that one gets a uniform $L^1$ bound for $g(y_n)$ on $[0,\alpha]$.
Can you get differentiability everywhere? Yes you can, assuming that $g$ is bounded and every point of $\mathbb{R}$ is a Lebesgue point of $g$.
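To see why the Lebesgue-point condition matters, here is a hedged numerical sketch with a bounded but discontinuous $g$ of my own choosing (a jump at $y=1/2$, which is then not a Lebesgue point of $g$): the Picard limit is Lipschitz, hence absolutely continuous, but it has a corner exactly where $y^*$ crosses the jump, so differentiability fails at that one point.

```python
import numpy as np

# A bounded, merely measurable g with a jump at y = 0.5 (hypothetical example,
# not from the original question): y = 0.5 is not a Lebesgue point of g.
def g(y):
    return np.where(y < 0.5, 1.0, 0.5)

alpha = 1.0
xs = np.linspace(0.0, alpha, 2001)
h = xs[1] - xs[0]

y = np.zeros_like(xs)               # y_0 ≡ 0
for _ in range(10):
    vals = g(y)
    # cumulative trapezoidal rule approximates x ↦ ∫_0^x g(y(ξ)) dξ
    y = np.concatenate(([0.0], np.cumsum((vals[1:] + vals[:-1]) / 2) * h))

# The limit is y*(x) = x for x <= 1/2 and y*(x) = 1/4 + x/2 afterwards:
# slope 1 before the crossing, slope 1/2 after, i.e. a corner at x = 1/2.
exact = np.where(xs <= 0.5, xs, 0.25 + 0.5 * xs)
print(np.max(np.abs(y - exact)))    # small up to discretization error
```

The limit still satisfies $y^*(x)=\int_0^x g(y^*(t))\,dt$ and is differentiable almost everywhere, consistent with the argument above; only at the single point where $y^*$ hits the non-Lebesgue point of $g$ does the derivative fail to exist.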