Let $G \subset \mathbb{C}$ be a domain and let $f:G\rightarrow \mathbb{C}$ be a holomorphic function with $f' \neq 0$ on $G$. Let $U \subset f(G)$ be a domain and let $s,t: U \rightarrow G$ be continuous branches of $f^{-1}$. I need to show that if $s(z_0) = t(z_0)$ for some $z_0 \in U$, then $s=t$.
I tried defining $h=s-t$, whose derivative at $z_0$ is known to be $0$, since $s'(z_0)=t'(z_0)= \frac{1}{f'(s(z_0))}$. How should I proceed from here?
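For intuition, the model example is $f=\exp$: any two continuous branches of its inverse differ by a constant in $2\pi i\,\Bbb Z$, so branches that agree at a single point agree everywhere. A small numeric illustration (the concrete branches below are my own choice, not part of the problem):

```python
import cmath

# f(z) = exp(z); two continuous branches of f^{-1} on the slit plane
s = lambda z: cmath.log(z)                  # principal branch of log
t = lambda z: cmath.log(z) + 2j * cmath.pi  # branch shifted by 2*pi*i

# their difference is the constant 2*pi*i at every point, so they agree
# nowhere -- consistent with the claim that branches agreeing at one
# point must agree everywhere
print(t(1 + 1j) - s(1 + 1j))  # approximately 6.2832j
```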
EDIT
Another attempt involving some calculus.
My idea is to pick $z_1 \in U$ with $s(z_1) \neq t(z_1)$ and define a continuous curve $\gamma :[0,1] \rightarrow U$ that connects $z_0$ to $z_1$.
Let $A = \left\{ r \in [0,1] : s( \gamma (r)) \neq t( \gamma (r)) \right\} $
and let $p = \inf A$ .
Clearly, $p$ can't be an element of $A$: the continuity of $s$, $t$ and $\gamma$ makes $A$ open in $[0,1]$, and $0 \notin A$ because $s(z_0)=t(z_0)$, so if $p$ were in $A$ a whole neighborhood of $p$ would lie in $A$, contradicting $p = \inf A$. Hence $p \notin A$, and so $s( \gamma (p)) = t( \gamma (p))$.
We'll denote $\gamma (p) := a \in U$ and $t(a) := \alpha \in G$.
Now we can regard $f$ as a continuously differentiable map from the real plane to itself. Since its real Jacobian determinant at $\alpha$ is $|f'(\alpha)|^2 \neq 0$, the inverse function theorem applies: $f$ has a unique continuous local inverse defined on a neighborhood $V \subset U$ of $a$. Since $s(a)=t(a)=\alpha$ and $s,t$ are continuous, both coincide with this local inverse near $a$, so (shrinking $V$ if necessary) $s(z)=t(z)$ for all $z \in V$.
By the continuity of $\gamma$, there is a $\delta > 0$ such that $\gamma (r) \in V$ for every $r \in (p- \delta , p+ \delta)$, so no such $r$ lies in $A$, contradicting that $p$ is the infimum of $A$.
Is this a valid explanation?
First note that a continuous branch of $f^{-1}$ is automatically holomorphic: near each point it agrees with the holomorphic local inverse that exists because $f' \neq 0$. So $s$ and $t$ may be differentiated. As $f(s(z))=z$ we have $$\tag1f'(s(z))s'(z)=1$$ and likewise $f'(t(z))t'(z)=1$ for all $z\in U$. In particular, $s'(z_0)=\frac1{f'(s(z_0))}=\frac1{f'(t(z_0))}=t'(z_0)$.
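As a sanity check on $(1)$ with a concrete example of my own choosing: take $f(z)=z^2$, whose two continuous inverse branches on the positive reals are $\pm\sqrt z$; both satisfy $f'(s(z))s'(z)=1$, and they agree nowhere away from the origin. A quick symbolic verification with sympy:

```python
import sympy as sp

z = sp.symbols('z', positive=True)
f = z**2                # concrete f with f'(z) = 2z != 0 away from 0
s = sp.sqrt(z)          # one continuous branch of f^{-1}
t = -sp.sqrt(z)         # the other branch

fp = sp.diff(f, z)                        # f'(z) = 2z
lhs_s = fp.subs(z, s) * sp.diff(s, z)     # f'(s(z)) * s'(z)
lhs_t = fp.subs(z, t) * sp.diff(t, z)     # f'(t(z)) * t'(z)
print(sp.simplify(lhs_s), sp.simplify(lhs_t))  # 1 1
```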
Taking derivatives of $(1)$ again, we have $$\tag2s'(z)^2f''(s(z))+f'(s(z))s''(z)=0,$$ which looks promising: I claim that $n$ applications of $\frac d{dz}$ to $(1)$ produce an equation of the form $$\tag3 f'(s(z))s^{(n+1)}(z)+\sum_{k=2}^{\infty}F_{n,k}\bigl(s'(z),s''(z),\ldots,s^{(n)}(z)\bigr)f^{(k)}(s(z))=0$$ where the $F_{n,k}(X_1,\ldots,X_n)\in \Bbb Z[X_1,\ldots,X_n]$ are integer polynomials in $n$ variables and $F_{n,k}=0$ for almost all $k$. Indeed, $(2)$ shows this for $n=1$, with $F_{1,2}(X_1)=X_1^2$ and $F_{1,k}(X_1)=0$ for $k>2$.

For a proof by induction, apply $\frac d{dz}$ to $(3)$, using the chain rule on each $f^{(k)}(s(z))$, to obtain $$\begin{align}0&=\frac d{dz}\left(f'(s(z))s^{(n+1)}(z)\right)+\sum_{k=2}^{\infty}\frac d{dz}\left(F_{n,k}\bigl(s'(z),s''(z),\ldots,s^{(n)}(z)\bigr)f^{(k)}(s(z))\right)\\ &=f'(s(z))s^{(n+2)}(z)+s'(z)f''(s(z))s^{(n+1)}(z)\\ &\quad+\sum_{k=2}^{\infty}s'(z)F_{n,k}\bigl(s'(z),s''(z),\ldots,s^{(n)}(z)\bigr)f^{(k+1)}(s(z))\\ &\quad+\sum_{k=2}^{\infty}f^{(k)}(s(z))\sum_{j=1}^n s^{(j+1)}(z)\frac\partial{\partial X_j}F_{n,k}\bigl(s'(z),s''(z),\ldots,s^{(n)}(z)\bigr), \end{align}$$ and this is of the desired form. (If you want to be really explicit, verify that we are led to $$F_{n+1,k}(X_1,\ldots, X_{n+1})=\sum_{j=1}^n X_{j+1}\frac\partial{\partial X_j}F_{n,k}+\begin{cases}X_1F_{n,k-1}\bigl(X_1,\ldots,X_n\bigr)&\text{if }k>2,\\ X_1X_{n+1}&\text{if }k=2\end{cases}$$ for all $k\ge2$.)

Note that by the same reasoning we have $$\tag4 f'(t(z))t^{(n+1)}(z)+\sum_{k=2}^{\infty}F_{n,k}\bigl(t'(z),t''(z),\ldots,t^{(n)}(z)\bigr)f^{(k)}(t(z))=0.$$

Now $s^{(n)}(z_0)=t^{(n)}(z_0)$ for all $n$: this follows by induction from $(3)$ and $(4)$ evaluated at $z_0$, using $s(z_0)=t(z_0)$ and $f'(s(z_0))\neq0$ to solve for the highest derivatives. In other words, $s$ and $t$ have the same Taylor series when developed around $z_0$. Since $U$ is a domain, the identity theorem then gives $s\equiv t$ on $U$.
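To make $(3)$ concrete for $n=2$: differentiating $(2)$ once gives $f'(s(z))s'''(z)+3s'(z)s''(z)f''(s(z))+s'(z)^3f'''(s(z))=0$, i.e. $F_{2,2}=3X_1X_2$ and $F_{2,3}=X_1^3$. Here is a sympy sanity check of that identity with the concrete pair $f=\exp$, $s=\log$ (my choice for illustration; for this $f$ we have $f^{(k)}(s(z))=e^{\log z}=z$ for every $k$):

```python
import sympy as sp

z = sp.symbols('z', positive=True)
s = sp.log(z)                                # inverse branch of f = exp
s1, s2, s3 = (sp.diff(s, z, k) for k in (1, 2, 3))

fk = z   # f^{(k)}(s(z)) = exp(log z) = z for every k, since f = exp

# equation (3) for n = 2:
#   f'(s) s''' + F_{2,2}(s', s'') f''(s) + F_{2,3}(s') f'''(s) = 0
# with F_{2,2} = 3*X1*X2 and F_{2,3} = X1**3
expr = fk*s3 + 3*s1*s2*fk + s1**3*fk
print(sp.simplify(expr))  # 0
```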