Consider the differential equation $ y'(x)=f(x,y(x)) $ with initial condition $y(x_0)=y_0$. According to some theorem (Peano's existence theorem, I believe), if $f$ is continuous, then there exists at least one solution satisfying the initial condition. If $f$ is also continuously differentiable, then the solution is unique.
So my question is: does $f$ really have to be continuously differentiable? Intuitively, I could calculate the values of $y$ by using the approximation formula $$ y(x_0+\Delta x) \approx y'(x_0)\Delta x+y(x_0) =f(x_0, y_0)\Delta x+y_0 $$ for very small $ \Delta x $. Isn't this the idea behind Euler's method? And this technique leads to a unique solution, right? Therefore we would not have any requirements for $f$ except that it be defined at every point of some interval.
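To make the question concrete, here is a minimal sketch of the forward Euler iteration the formula above describes (the function names and the test problem $y'=y$, $y(0)=1$ are my own choices for illustration):

```python
import math

def euler(f, x0, y0, dx, steps):
    """Forward Euler: repeatedly apply y <- y + f(x, y) * dx."""
    x, y = x0, y0
    for _ in range(steps):
        y += f(x, y) * dx
        x += dx
    return y

# y' = y with y(0) = 1 has the exact solution e^x, so Euler over
# [0, 1] with a small step should land near e.
approx = euler(lambda x, y: y, 0.0, 1.0, 0.0001, 10000)
print(approx, math.e)
```

For this smooth $f$ the approximation does converge to the true solution as $\Delta x \to 0$; the question is what happens when $f$ is merely continuous.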
If $f$ is not continuously differentiable, different choices of $\Delta x$ can yield wildly different values, and there need be no limit as $\Delta x \rightarrow 0$. See https://math.stackexchange.com/a/3338788/123905 for an illustration of how crinkly a function can be when its derivative is not continuous: nearby derivatives are wildly unrelated, so positive choices of $\Delta x$ tell you nothing about the instantaneous derivative at $x$.
The cited answer gives examples of random walks. It should not be surprising that two random walks can have identical initial segments, then wander off, never to meet again.
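A classical, non-random example of the same failure is $y' = 3y^{2/3}$ with $y(0)=0$: the right-hand side is continuous but not Lipschitz at $y=0$, and both $y \equiv 0$ and $y = x^3$ solve the initial value problem. The sketch below (my own illustration; the step size and perturbation are arbitrary) shows that Euler's method does not rescue uniqueness here, since it silently commits to one of the two solutions:

```python
def euler(f, x0, y0, dx, steps):
    """Forward Euler: repeatedly apply y <- y + f(x, y) * dx."""
    x, y = x0, y0
    for _ in range(steps):
        y += f(x, y) * dx
        x += dx
    return y

# f is continuous but not Lipschitz at y = 0, so y' = f, y(0) = 0
# has at least two solutions: y(x) = 0 and y(x) = x**3.
f = lambda x, y: 3.0 * abs(y) ** (2.0 / 3.0)

# Started exactly at 0, Euler stays on the trivial solution forever:
print(euler(f, 0.0, 0.0, 0.001, 1000))    # 0.0

# A tiny perturbation of y0 makes it track y = x**3 instead,
# giving a value near 1 at x = 1:
print(euler(f, 0.0, 1e-12, 0.001, 1000))
```

So the iteration is perfectly well defined for every continuous $f$, but the value it converges to can depend discontinuously on the data, which is exactly why mere continuity buys existence but not uniqueness.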