Is it possible to use the implicit function theorem to prove the existence of solutions of ordinary differential equations?
I have seen a proof of the existence of solutions of ordinary differential equations under the following conditions:
We consider an initial value problem where the function and its partial derivative with respect to $y$ are continuous.
However, these are two of the hypotheses of the implicit function theorem, so I feel there should be a way to connect the two.
It is possible to use the implicit function theorem to prove uniqueness of solutions of equations of a certain type. The connection is more transparent for systems of the form $$ \frac{dx}{dt}=f(x,y),\qquad \frac{dy}{dt}=g(x,y) \tag{1}$$ which include the standard ODE $y'=g(t,y)$ as a special case (let $f\equiv 1$, so that $x\equiv t$).
Suppose there is a function $\psi$ of $x,y$ such that $\nabla \psi = (-g,f)$. Then (1) implies $$\frac{d}{dt}\psi(x(t),y(t)) = \psi_x\,\frac{dx}{dt}+\psi_y\,\frac{dy}{dt} = -g\,f + f\,g = 0$$ and therefore every trajectory of (1) is contained in a level set of $\psi$. If the implicit function theorem applies to $\psi$, i.e., if it is $C^1$ smooth with nonvanishing gradient, then the level sets are smooth curves. In particular, they do not branch, and this implies uniqueness for the solution of (1).
The local existence of $\psi$ is equivalent to the equality $-g_y=f_x$ (equality of the mixed partials $\psi_{xy}$ and $\psi_{yx}$), which says precisely that $(f,g)$ is a field of zero divergence: $f_x+g_y=0$. So, (1) has a unique solution (for given initial values) under this condition.
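As an illustration (my own example, not from the answer above), take the divergence-free field $f(x,y)=y$, $g(x,y)=-x$. A short sympy check constructs $\psi$ by integrating $\psi_x=-g$ and $\psi_y=f$, and verifies that $\psi$ is constant along trajectories:

```python
import sympy as sp

x, y = sp.symbols('x y')

# Example divergence-free field (chosen for illustration):
# dx/dt = f = y, dy/dt = g = -x
f = y
g = -x

# Zero divergence f_x + g_y = 0 guarantees a local potential psi
# with grad(psi) = (-g, f).
assert sp.simplify(sp.diff(f, x) + sp.diff(g, y)) == 0

# Build psi: integrate psi_x = -g in x, then fix the y-dependence.
psi = sp.integrate(-g, x)                    # x**2/2, up to a function of y
psi += sp.integrate(f - sp.diff(psi, y), y)  # adds y**2/2
assert sp.diff(psi, x) == -g and sp.diff(psi, y) == f

# Along any solution, d/dt psi(x(t), y(t)) = psi_x*f + psi_y*g = 0,
# so trajectories lie on level sets of psi (circles here).
dpsi_dt = sp.simplify(sp.diff(psi, x) * f + sp.diff(psi, y) * g)
assert dpsi_dt == 0
print(psi)  # x**2/2 + y**2/2
```

Here the level sets $x^2/2+y^2/2=c$ are circles, matching the familiar fact that the solutions of $x'=y$, $y'=-x$ are circular orbits.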
One can generalize by observing that it suffices to have $\nabla \psi = (-g,f)\,h$ for some nonvanishing scalar function $h$ (an integrating factor).
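A classical instance of this generalization (my choice of example) is the Lotka–Volterra system, whose field is not divergence-free, but becomes so after multiplication by $h = 1/(xy)$; the resulting $\psi$ is the well-known conserved quantity. A sympy sketch:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
a, b, c, d = sp.symbols('a b c d', positive=True)

# Lotka-Volterra system (example): dx/dt = f, dy/dt = g.
f = x * (a - b * y)
g = y * (c * x - d)

# (f, g) itself is not divergence-free...
assert sp.simplify(sp.diff(f, x) + sp.diff(g, y)) != 0

# ...but h = 1/(x*y) is an integrating factor: h*(f, g) has zero divergence.
h = 1 / (x * y)
assert sp.simplify(sp.diff(h * f, x) + sp.diff(h * g, y)) == 0

# So grad(psi) = (-g, f)*h has a local potential psi.
psi = sp.integrate(-h * g, x)                    # d*log(x) - c*x
psi += sp.integrate(h * f - sp.diff(psi, y), y)  # + a*log(y) - b*y

# psi is constant along trajectories: psi_x*f + psi_y*g = 0.
assert sp.simplify(sp.diff(psi, x) * f + sp.diff(psi, y) * g) == 0
print(psi)
```

The potential comes out as $\psi = d\ln x - cx + a\ln y - by$, whose level sets are the closed orbits of the predator-prey model.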
Closely related: Bendixson–Dulac theorem.