I have an ODE of a reasonably simple form: $$ -x(y')^2 + 2yy' + x = 0 \qquad \qquad (\ast) $$ where $y$ is only assumed to be a differentiable function of $x$ (note: not necessarily $C^1$!). The proof I saw took another derivative and then used a reduction of order to get something separable. I found the solution to $(\ast)$ under my regularity assumption by making a substitution, obtaining a separable DE, solving it, and so on. And even then, I already knew the form of the solution, so I wasn't particularly surprised. But someone pointed out that I didn't need to actually grind out the solution: I could check that the pretty-much-known solution satisfies the DE, and then with suitable initial conditions (also known) a uniqueness theorem takes over.
However, it's not clear to me what theorem to use. In principle, I could make the substitution and then apply the Picard–Lindelöf theorem to get the uniqueness for the new ODE, but the substitution still feels like inspired magic (it kinda is: I just tried something at random, and was surprised when it worked). Using uniqueness also feels like magic, but of a different sort. At the very least, the solution is well-motivated. So
What uniqueness result for ODEs can I apply to $(\ast)$ directly?
I'm not sure if this is what you are looking for, but we can use the quadratic formula to solve $(\ast)$ for $y'$:
$$y' = \frac{y\pm\sqrt{y^2+x^2}}{x} \equiv f(y,x)$$
which requires $x \neq 0$ throughout the domain. Taking the derivative of the right-hand side with respect to $y$ gives
$$\partial_y f (y,x) = \frac{1}{x} \pm \frac{y}{x\sqrt{y^2+x^2}}$$
Since $|y|/\sqrt{y^2+x^2} \le 1$, this derivative satisfies $|\partial_y f(y,x)| \le 2/|x|$ for both branches; restricting the interval of existence to some predetermined neighborhood bounded away from $x = 0$, say $|x| \ge \delta > 0$, makes this bound uniform: $|\partial_y f| \le 2/\delta$. So $f$ is Lipschitz in $y$ there, and the Picard–Lindelöf theorem gives a unique solution for the problem with prescribed initial conditions (for each fixed choice of sign).
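As a quick numerical sanity check (not part of the argument), one can verify at random sample points that each branch of $f$ really is a root of the quadratic in $(\ast)$, and that the bound $|\partial_y f| \le 2/|x|$ holds. A minimal sketch in Python, with hypothetical helper names `f` and `dfdy`:

```python
import math
import random

def f(y, x, sign):
    # One branch of the roots: y' = (y ± sqrt(y^2 + x^2)) / x
    return (y + sign * math.hypot(y, x)) / x

def dfdy(y, x, sign):
    # Partial derivative in y: 1/x ± y / (x * sqrt(y^2 + x^2))
    return 1 / x + sign * y / (x * math.hypot(y, x))

random.seed(0)
for _ in range(1000):
    x = random.choice([-1, 1]) * random.uniform(0.1, 10)  # stay away from x = 0
    y = random.uniform(-10, 10)
    for sign in (+1, -1):
        p = f(y, x, sign)
        # Residual of the original ODE: -x (y')^2 + 2 y y' + x with y' = f(y, x)
        assert abs(-x * p * p + 2 * y * p + x) < 1e-6
        # Lipschitz bound on the slope field: |∂_y f| ≤ 2/|x|
        assert abs(dfdy(y, x, sign)) <= 2 / abs(x) + 1e-12
print("all checks passed")
```

Using `math.hypot` for $\sqrt{y^2+x^2}$ avoids overflow and keeps the cancellation in the $-$ branch well-behaved.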