In Zeidler's text on applied functional analysis, p. 24, he writes:
The Picard–Lindelöf Theorem: Assume the following:
(a) the function $F: S \to \mathbb{R}$ is continuous and the partial derivative $F_u:S \to \mathbb{R}$ is also continuous
(b) we set $M = \max_{(x,u)\in S} |F(x,u)|$ and $L = \max_{(x,u)\in S} |F_u(x,u)|$, and we choose a real number $h$ such that $$0 < h \leq r, \quad hM \leq r, \quad hL < 1.$$
He then states that there exists a unique solution to the IVP $u' = F(x,u)$, $u(x_0) = u_0$ on the square $S = \{(x,u) \in \mathbb{R}^2: |x - x_0|\leq r, |u - u_0|\leq r\}$, $r > 0$.
My question is: where is the usual Lipschitz-continuity condition in the assumptions of the theorem?
For example, Wolfram MathWorld states the theorem with an explicit Lipschitz condition: http://mathworld.wolfram.com/PicardsExistenceTheorem.html

I am guessing that the bound $L$ must indirectly imply Lipschitz continuity of the function $F$, but why is it stated in this particular way?
Where is the Lipschitz continuity? It's hiding in disguise as the bounded derivative $F_u(x, u)$. Since $\Vert F_u(x, u) \Vert \le L$ on $S$, take any $u_1, u_2$ with $(x, u_1), (x, u_2) \in S$ and let $\gamma(t) = (1 - t)u_1 + tu_2$, $t \in [0, 1]$, be the line segment joining them (the square $S$ is convex, so the segment stays in $S$); we have
$\gamma'(t) = u_2 - u_1; \tag{1}$
then by the vector version of the fundamental theorem of calculus, using ${}^\prime$ to denote the $s$-derivative,
$\Vert F(x, u_2) - F(x, u_1) \Vert = \Vert F(x, \gamma(1)) - F(x, \gamma(0)) \Vert$ $= \left\Vert \int_0^1 (F(x, \gamma(s)))' \, ds \right\Vert = \left\Vert \int_0^1 F_u(x, \gamma(s)) \gamma'(s) \, ds \right\Vert$ $\le \int_0^1 \Vert F_u(x, \gamma(s)) \Vert \, \Vert \gamma'(s) \Vert \, ds = \int_0^1 \Vert F_u(x, \gamma(s)) \Vert \, \Vert u_2 - u_1 \Vert \, ds$ $= \Vert u_2 - u_1 \Vert \int_0^1 \Vert F_u(x, \gamma(s)) \Vert \, ds \le \Vert u_2 - u_1 \Vert \int_0^1 L \, ds = L \Vert u_2 - u_1 \Vert, \tag{2}$
which shows the Lipschitz continuity of $F(x, u)$. The point is that functions having bounded derivatives are always Lipschitz continuous.
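As a quick numerical sanity check (my own toy example, not one from Zeidler), take $F(x, u) = x + \sin u$, for which $F_u(x, u) = \cos u$ and hence $L = 1$; the Lipschitz bound $|F(x, u_2) - F(x, u_1)| \le L |u_2 - u_1|$ then holds at every sampled pair of points:

```python
import numpy as np

# Toy example (an assumption, not Zeidler's): F(x, u) = x + sin(u),
# so F_u(x, u) = cos(u) and L = max |cos(u)| = 1.
F = lambda x, u: x + np.sin(u)
L = 1.0  # bound on |F_u|

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 1000)
u1 = rng.uniform(-1.0, 1.0, 1000)
u2 = rng.uniform(-1.0, 1.0, 1000)

# Check the Lipschitz estimate |F(x, u2) - F(x, u1)| <= L |u2 - u1|
# at each sampled triple (small slack for floating-point roundoff)
ok = np.abs(F(x, u2) - F(x, u1)) <= L * np.abs(u2 - u1) + 1e-12
print(bool(ok.all()))
```

Of course this checks only finitely many points; the estimate $(2)$ is what guarantees it everywhere on $S$.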
So our OP Illegal Immigrant is right: the existence of the bound $L$ on $F_u(x, u)$ implies Lipschitz continuity. I think it's stated this way because differentiable functions are generally easier to handle than merely continuous ones.
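For completeness: the existence half of the theorem comes from the Picard iteration $u_{n+1}(x) = u_0 + \int_{x_0}^{x} F(t, u_n(t))\, dt$, which the condition $hL < 1$ turns into a contraction on $C[x_0 - h, x_0 + h]$. A minimal numerical sketch, using the toy IVP $u' = u$, $u(0) = 1$ (my choice of example, not Zeidler's), whose exact solution is $e^x$:

```python
import numpy as np

# Picard iteration u_{n+1}(x) = u0 + integral from x0 to x of F(t, u_n(t)) dt.
# Toy IVP (an assumption for illustration): u' = u, u(0) = 1, solution exp(x).
F = lambda x, u: u
x0, u0, h = 0.0, 1.0, 0.5          # here L = 1, so h*L = 0.5 < 1
xs = np.linspace(x0, x0 + h, 201)

u = np.full_like(xs, u0)            # start from the constant function u0
for _ in range(20):
    integrand = F(xs, u)
    # cumulative trapezoidal integral from x0 to each grid point
    integral = np.concatenate(([0.0], np.cumsum(
        0.5 * (integrand[1:] + integrand[:-1]) * np.diff(xs))))
    u = u0 + integral               # next Picard iterate

print(bool(np.max(np.abs(u - np.exp(xs))) < 1e-5))
```

Each iterate here reproduces one more term of the Taylor series of $e^x$, and the contraction property is exactly what the bound $hL < 1$ in assumption (b) buys you.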