Henri Cartan's Implicit Function Theorem


I am reading Henri Cartan's "Elementary Theory of Analytic Functions of One or Several Complex Variables". On pp. 64-65, Cartan uses a version of the Implicit Function Theorem that I have never seen:

[Images of pp. 64-65 of Cartan's book omitted.]

In the version found in most textbooks, an implicit function defined by the equation $f(x,y)=0$, where $f:\mathbb R^{m+n}\to\mathbb R^n$, is expressed locally as a function $y=g(x)$ with $g:\mathbb R^m\to\mathbb R^n$. However, Cartan's version looks totally different and I don't know what's going on. Would anyone please explain Cartan's version as well as his proof of the lemma in the images above? Thank you very much.

Accepted answer:

Suppose for the sake of concreteness that $\gamma_1'(t_0)\neq 0$. Now, consider the mapping $\delta:(a,b)\times \Bbb{R}\to\Bbb{R}^2$ defined as \begin{align} \delta(t,u)&:=\gamma(t)+\text{sign}(\gamma_1'(t_0))\cdot (0,u)= (\gamma_1(t), \gamma_2(t)+ \text{sign}(\gamma_1'(t_0))\cdot u) \end{align}

Since $\gamma$ is a $C^1$ mapping by assumption, so is $\delta$. Its Jacobian matrix is \begin{align} \delta'(t,u)&= \begin{pmatrix} \gamma_1'(t) & 0\\ \gamma_2'(t) & \text{sign}(\gamma_1'(t_0)) \end{pmatrix} \end{align}

So, in particular, at the point $(t_0,0)$, we have \begin{align} \delta'(t_0,0)&= \begin{pmatrix} \gamma_1'(t_0)& 0\\ \gamma_2'(t_0)& \text{sign}(\gamma_1'(t_0)) \end{pmatrix} \end{align}

which is a matrix with determinant equal to $|\gamma_1'(t_0)|>0$. This means $\delta$ satisfies the assumptions of the inverse function theorem (and it is a standard exercise to show this is equivalent to the implicit function theorem). As a result:

  • There is an open neighborhood $U'$ of the point $(t_0,0)$ and an open neighborhood $V'$ of the point $\delta(t_0,0)=\gamma(t_0)=(x_0,y_0)$ such that the restriction $\delta:U'\to V'$ is a $C^1$ diffeomorphism.

It is clear by definition of $\delta$ that $\delta(t,0)=\gamma(t)$, so point (i) of Cartan's lemma has been addressed. Now, if we shrink the open set $U'$ to a small enough open ball $U$, then (by continuity of the Jacobian determinant) by setting $V=\delta[U]$, the further restricted mapping $\delta:U\to V$ is a $C^1$ diffeomorphism such that $\det \delta'(t,u)>0$ for all $(t,u)\in U$.
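The construction above can be checked numerically. The following is a minimal sketch for a hypothetical concrete curve $\gamma(t)=(\cos t,\sin t)$ with $t_0=\pi/2$ (my choice of curve, not from the answer): it verifies that $\delta(t,0)=\gamma(t)$, that $\det\delta'(t_0,0)=|\gamma_1'(t_0)|>0$, and that the determinant stays positive on a neighborhood of $(t_0,0)$.

```python
import math

# Hypothetical concrete curve for illustration: gamma(t) = (cos t, sin t).
def gamma(t):
    return (math.cos(t), math.sin(t))

def gamma_prime(t):
    return (-math.sin(t), math.cos(t))

t0 = math.pi / 2                          # here gamma_1'(t0) = -1 != 0
s = math.copysign(1.0, gamma_prime(t0)[0])  # sign(gamma_1'(t0)) = -1

# The answer's mapping: delta(t, u) = gamma(t) + sign(gamma_1'(t0)) * (0, u)
def delta(t, u):
    x, y = gamma(t)
    return (x, y + s * u)

# det delta'(t, u) = gamma_1'(t) * sign(gamma_1'(t0)); independent of u
def jac_det(t, u):
    dx, _ = gamma_prime(t)
    return dx * s

# Point (i): delta(t, 0) = gamma(t)
assert delta(t0, 0) == gamma(t0)

print(jac_det(t0, 0))   # 1.0, i.e. |gamma_1'(t0)| > 0

# The determinant stays positive on a neighborhood of (t0, 0)
print(all(jac_det(t0 + dt, 0.1) > 0 for dt in (-0.3, 0.0, 0.3)))  # True
```

Note that the determinant here does not depend on $u$ at all, which is why positivity near $(t_0,0)$ reduces to positivity of $\gamma_1'(t)\,\text{sign}(\gamma_1'(t_0))$ for $t$ near $t_0$.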


Note by the way that the mapping $\delta$ is not uniquely defined. For example, if $f:\Bbb{R}\to\Bbb{R}$ is any smooth map such that $f'(0)=\text{sign}(\gamma_1'(t_0))$, then the proof above applies word for word to the mapping $(t,u)\mapsto \gamma(t)+ (0,f(u))=(\gamma_1(t),\gamma_2(t)+f(u))$. This will of course yield different open sets $U,V$, but that is not relevant: all we care about here is that there exists some such diffeomorphism.

Edit:

Maybe I should have done this from the beginning, but the direct way to do this without any case distinction is to let $\xi\in\Bbb{R}^2$ be a vector such that $\{\gamma'(t_0),\xi\}$ is a basis of $\Bbb{R}^2$ (possible since $\gamma'(t_0)\neq 0$). Moreover, we may choose $\xi$ such that $\det \begin{pmatrix}\gamma'(t_0)&\xi\end{pmatrix}>0$; for example, $\xi$ could be the vector $\gamma'(t_0)$ rotated counterclockwise by 90 degrees. Then we simply consider the mapping $\delta:(a,b)\times \Bbb{R}\to\Bbb{R}^2$ defined by $\delta(t,u)=\gamma(t)+u\cdot\xi$, apply the inverse function theorem to $\delta$ about the point $(t_0,0)$, and restrict the domain sufficiently to ensure the Jacobian determinant is strictly positive.
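The case-free construction can likewise be checked numerically. Below is a sketch for a hypothetical curve $\gamma(t)=(t,t^2)$ with $t_0=1$ (again my choice, not from the answer), taking $\xi$ to be $\gamma'(t_0)$ rotated counterclockwise by 90 degrees; with this choice, $\det\delta'(t_0,0)=\|\gamma'(t_0)\|^2>0$ automatically, with no sign case distinction.

```python
# Hypothetical concrete curve: gamma(t) = (t, t^2), so gamma'(t) = (1, 2t).
def gamma(t):
    return (t, t * t)

def gamma_prime(t):
    return (1.0, 2.0 * t)

t0 = 1.0
gx, gy = gamma_prime(t0)   # (1, 2), nonzero

# Rotate gamma'(t0) counterclockwise by 90 degrees to get xi.
xi = (-gy, gx)             # (-2, 1)

# The edit's mapping: delta(t, u) = gamma(t) + u * xi
def delta(t, u):
    x, y = gamma(t)
    return (x + u * xi[0], y + u * xi[1])

# The Jacobian of delta at (t, u) has columns gamma'(t) and xi;
# at (t0, 0) its determinant is |gamma'(t0)|^2.
def jac_det(t, u):
    dx, dy = gamma_prime(t)
    return dx * xi[1] - dy * xi[0]

assert delta(t0, 0) == gamma(t0)   # delta restricts to gamma on u = 0
print(jac_det(t0, 0))              # 5.0  ( = 1^2 + 2^2 > 0 )
```

The design choice here mirrors the edit: rotating $\gamma'(t_0)$ by 90 degrees makes the determinant $\gamma_1'(t_0)^2+\gamma_2'(t_0)^2$, which is positive whenever $\gamma'(t_0)\neq 0$, so no sign bookkeeping is needed.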