I have an autonomous system of nonlinear differential equations \begin{align*} \tag{0} x' &= g(x), \end{align*} where $x$ is an $N$-dimensional variable, with $N > 2$, and $g(x)$ is an $N \times 1$ vector: \begin{align*} g(x)=\left[\begin{array}{c} g_{1}(x) \\ \vdots \\ g_{N}(x) \end{array}\right] \end{align*} $x$ is in fact $x(t)$, but since the system is autonomous I drop $t$ in what follows; $x'$ is $dx(t)/dt$. I have an isolated equilibrium/fixed point $x^* $ where $g(x^* ) = 0$, isolated in the sense that there is no other point in a neighborhood of $x^* $ such that $g(x) = 0$.
$g(x)$ is in fact a very special function: $g(x) = adj A(x) f(x)$, where $f(x)$ is an $N \times 1$ vector and $adj A(x)$ is the adjugate (classical adjoint) of an $N \times N$ matrix $A(x)$. $x^* $ is a singularity such that $det A(x^* ) = 0$, and it also happens that $f$ is such that $g(x^* ) = 0$ too.
I want to prove that $0 \leq $ Rank $g'(x^* )$ $\leq 2$, even though $x$ is of dimension $N$ (Marszalek, Amdeberhan and Riaza, 2005), where $g'(x^* )$ is the Jacobian of the $N \times 1$ vector $g$ evaluated at $x^* $: \begin{align*} g'(x^* )=\left[\begin{array}{ccc} \dfrac{\partial g_{1}(x^* )}{\partial x_{1}} & \cdots & \dfrac{\partial g_{1}(x^* )}{\partial x_{N}} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial g_{N}(x^* )}{\partial x_{1}} & \cdots & \dfrac{\partial g_{N}(x^* )}{\partial x_{N}} \end{array}\right] \end{align*}
This result may be extremely simple to prove, since it is not discussed much in the papers I have read, yet I am stuck so far. I would like to prove it in the simplest possible way, for example by computing the Jacobian and its rank directly.
Some context:
I want to show there is only one strictly increasing solution crossing a specific singularity in dimension N. I have autonomous quasilinear differential equations of the form
\begin{equation}\tag{1}
A(x) x' = f(x),
\end{equation}
with an isolated noncritical singular point $x^* $, i.e. $det A(x^* ) = 0$ but $(det A)'(x^* ) \neq 0$ (cf. Rabier (1989), e.g.). Here $x$ is of dimension $N$, $A(x)$ is an $N \times N$ matrix, and $f(x)$ is an $N \times 1$ vector.
Moreover, it is not a standard impasse point, because I am in the special case where $f(x^* ) \in Im(A(x^* ))$. These points seem to have several different names in the literature: "I-singularity", for Image singularity, since $f(x^* ) \in Im(A(x^* ))$ (Sotomayor and Zhitomirskii, 2001), or Geometric Singularity (Marszalek, Amdeberhan and Riaza, 2005).
Typically, to study the solution of these systems, we use the property $adj A(x) A(x) = det A(x) I$, and we rewrite (1) as
\begin{align*} \tag{2}
det A(x) x' &= adj A(x) f(x) \\
\iff \omega(x) x' &= g(x).
\end{align*}
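The identity $adj A(x) A(x) = det A(x) I$ behind this rewriting is easy to sanity-check numerically. Below is a minimal sketch (my own helper `adjugate`, computed from cofactors; not from any of the cited papers) on a random $4 \times 4$ matrix:

```python
import numpy as np

def adjugate(M):
    """Adjugate (classical adjoint): transpose of the cofactor matrix."""
    n = M.shape[0]
    C = np.empty_like(M)
    for i in range(n):
        for j in range(n):
            # (i, j) cofactor: signed determinant of the (i, j) minor
            minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# adj A . A = A . adj A = det A . I holds for any square matrix,
# invertible or not
assert np.allclose(adjugate(A) @ A, np.linalg.det(A) * np.eye(4))
assert np.allclose(A @ adjugate(A), np.linalg.det(A) * np.eye(4))
```

The identity holds even when $A$ is singular, which is exactly why (2) is still meaningful at the singular point.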
The geometric singularity property yields that $g(x^ *) = 0$.
Since $\omega(x) = det A(x)$ is only a scalar, we can study the solutions of (2) by studying the solutions of a time-transformed 'standard' system of nonlinear equations (sometimes called the desingularized field):
\begin{align*} \tag{3}
\tilde{x}' &= g(\tilde{x}).
\end{align*}
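If I understand the construction correctly, the time transformation behind (3) can be made explicit: away from the singular set, define a new time $\tau$ by $dt/d\tau = \omega(x)$ and set $\tilde{x}(\tau) = x(t(\tau))$; then

\begin{align*}
\frac{d\tilde{x}}{d\tau} = \frac{dx}{dt}\,\frac{dt}{d\tau} = \omega(x)\, x' = g(\tilde{x}),
\end{align*}

so the trajectories of (3) coincide with those of (2) as curves, with time orientation reversed wherever $\omega < 0$.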
Obviously, since $g(x^* ) = 0$, $x^*$ gives an equilibrium point to this nonlinear differential equation.
In my work, for a given initial value, I know (by construction) that there exists a strictly increasing solution to the original quasilinear system (1) crossing the singularity $x^* $. I want to prove that this solution is the unique strictly increasing solution (there may be other non strictly increasing solutions). So I guess it comes down to showing that there exists only one strictly increasing solution to (3).
According to Marszalek, Amdeberhan and Riaza (2005), this seems to be true, because "there are at most two trajectories smoothly crossing the singular set in opposite directions" (though I still need to work out the transversality conditions in my setup).
I was trying to reproduce the "proofs" in Marszalek, Amdeberhan and Riaza (2005), but there is one initial statement I have not been able to prove yet:
It may be shown that, at a geometric singular point $x^* $ of (1), we have $0 \leq Rank g'(x^* ) \leq 2$.
I do not know how to prove this. I have been trying to show that the Jacobian of $g$ at $x^* $ has only $2$ nonvanishing eigenvalues (of opposite sign).
I guess an essential part is to use that Rank $A(x^* ) = N-1$, thus Rank $adj A(x^* ) = 1$ and then go on with the Jacobian of $g$ using that $g(x) = adj A(x) f(x)$. But it does not yield anything interesting in my attempts. Moreover it does not use the fact that the equilibrium point is isolated yet (i.e. that $(det A)'(x^* ) \neq 0$), but maybe it's not useful information at all here?
By the way, I have seen some very quick arguments about why this is $\leq 2$ in Sotomayor and Zhitomirskii (2001), p. 581. But this paper is very technical for me (germs, normal forms, ideal? etc.) and I do not understand it yet. Isn't there a way to show it purely by "computation", showing that there cannot be more than $2$ nonvanishing eigenvalues?
First, an important point (I think): from the fact that the fixed point $x^* $ is isolated, i.e. $(det A)'(x^* ) \neq 0$, it follows that $rank A(x^* ) = N-1$, and not less. If it were less, the proof below would not work.
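This step can be justified with Jacobi's formula for the derivative of the determinant:

\begin{align*}
\frac{\partial \, det A(x)}{\partial x_{j}} = tr\!\left( adj A(x) \, \frac{\partial A(x)}{\partial x_{j}} \right), \qquad j = 1, \dots, N.
\end{align*}

If $rank A(x^* ) \leq N-2$, then every $(N-1)\times(N-1)$ minor of $A(x^* )$ vanishes, so $adj A(x^* ) = 0$, and the formula forces $(det A)'(x^* ) = 0$, contradicting noncriticality. Hence $rank A(x^* ) = N-1$ exactly (it cannot be $N$ either, since $det A(x^* ) = 0$).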
Now, notice that $A \times adj(A) = det A \times I$.
Thus, denote $H(x) = A(x) \times g(x) = det A(x) \times f(x)$; this holds for all $x$.
Let us compute the Jacobian of $H(x)$ at $x^* $. By the product rule, $J_H(x) = f(x) \, [\partial det A(x)/\partial x]^T + det A(x) \, J_f(x)$, and the second term vanishes at $x^* $ because $det A(x^* ) = 0$, so: \begin{align*} J_H(x^* )=\left[\begin{array}{ccc} \dfrac{\partial det A(x^* )}{\partial x_{1}} f_1(x^* ) & \cdots & \dfrac{\partial det A(x^* )}{\partial x_{N}} f_1(x^* ) \\ \vdots & \ddots & \vdots \\ \dfrac{\partial det A(x^* )}{\partial x_{1}} f_N(x^* ) & \cdots & \dfrac{\partial det A(x^* )}{\partial x_{N}} f_N(x^* ) \end{array}\right] = \underbrace{f(x^* )}_{N\times1} \ \underbrace{[\partial det A(x^* )/\partial x]^T}_{1\times N} \end{align*}
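The outer-product structure of $J_H(x^* )$ can be checked numerically. Below is a sketch on a toy example of my own making (not from the cited papers): $N = 4$, $x^* = 0$, $A(x) = diag(x_1, 1, 1, 1)$ so $det A(x) = x_1$, and an affine $f$ with $f(0) \neq 0$. The finite-difference Jacobian of $H(x) = det A(x) f(x)$ at $0$ should equal $f(0)$ times the gradient of $det A$ transposed, a rank-one matrix:

```python
import numpy as np

# Toy example (mine): A(x) = diag(x1, 1, 1, 1), so det A(x) = x1,
# which vanishes at x* = 0 with nonzero gradient e1.
def f(x):
    return np.array([x[1], 2.0, x[2] - 1.0, x[3] + 3.0])

def H(x):
    det_A = x[0]          # det A(x) for this choice of A
    return det_A * f(x)

def jacobian(fun, x, h=1e-6):
    """Central finite-difference Jacobian of fun at x."""
    n = len(x)
    J = np.empty((len(fun(x)), n))
    for j in range(n):
        e = np.zeros(n); e[j] = h
        J[:, j] = (fun(x + e) - fun(x - e)) / (2 * h)
    return J

xs = np.zeros(4)
JH = jacobian(H, xs)
grad_det = np.array([1.0, 0.0, 0.0, 0.0])  # gradient of det A = x1 at 0

# J_H(x*) = f(x*) [grad det A(x*)]^T, an outer product of rank 1
assert np.allclose(JH, np.outer(f(xs), grad_det), atol=1e-6)
assert np.linalg.matrix_rank(JH, tol=1e-4) == 1
```

Here the rank is exactly $1$ because $f(0) \neq 0$ and $(det A)'(0) \neq 0$; if $f(x^* ) = 0$ the outer product degenerates and the rank drops to $0$.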
Since $(det A)'(x^* ) \neq 0$, this outer product is nonzero whenever $f(x^* ) \neq 0$, so $rank(J_H(x^* )) \leq 1$ in all cases, with equality if $f(x^* ) \neq 0$. (Note that $(det A)'(x^* ) \neq 0$ was already used above to get $rank A(x^* ) = N-1$; I am not sure it matters anywhere else.)
Now, if we denote $J_g(x)$ the Jacobian of $g$ at $x$, since $H(x) = A(x) \times g(x)$, and at $x^* $, $g(x^* ) = 0$, we have \begin{align*} J_H (x^* ) = A(x^* ) J_g(x^* ). \end{align*}
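Spelling this out componentwise with the product rule makes clear why only the $A J_g$ term survives:

\begin{align*}
\frac{\partial H_{i}(x^* )}{\partial x_{j}} = \sum_{k=1}^{N} \frac{\partial A_{ik}(x^* )}{\partial x_{j}}\, \underbrace{g_{k}(x^* )}_{=0} + \sum_{k=1}^{N} A_{ik}(x^* )\, \frac{\partial g_{k}(x^* )}{\partial x_{j}} = \left[ A(x^* )\, J_g(x^* ) \right]_{ij}.
\end{align*}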
Using Sylvester's rank inequality: \begin{align*} rank(J_H(x^* )) = rank(A(x^* ) J_g(x^* )) \geq rank(A(x^* )) + rank(J_g(x^* )) - N. \end{align*} Since $rank A(x^* ) = N-1$ and $rank J_H(x^* ) \leq 1$, it yields the desired result: \begin{align*} rank J_g(x^* ) \leq rank(J_H(x^* )) - rank(A(x^* )) + N \leq 1 - (N-1) + N = 2. \end{align*}
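The whole chain, $g = adj A \cdot f$, geometric singularity at $x^* $, rank bound on $J_g(x^* )$, can be checked end to end on a toy example of my own (not from the cited papers): $N = 4$, $x^* = 0$, $A(x) = diag(x_1, 1, 1, 1)$, and $f$ chosen so that $g(0) = adj A(0) f(0) = 0$:

```python
import numpy as np

# Toy geometric singularity: det A(x) = x1 vanishes at 0 with nonzero
# gradient, and f(0) = (0, 2, -1, 3) lies in Im A(0), so g(0) = 0.
def A(x):
    return np.diag([x[0], 1.0, 1.0, 1.0])

def f(x):
    return np.array([x[1], 2.0, x[2] - 1.0, x[3] + 3.0])

def adjugate(M):
    """Adjugate: transpose of the cofactor matrix."""
    n = M.shape[0]
    C = np.empty_like(M)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

def g(x):
    return adjugate(A(x)) @ f(x)

def jacobian(fun, x, h=1e-6):
    """Central finite-difference Jacobian of fun at x."""
    n = len(x)
    J = np.empty((n, n))
    for j in range(n):
        e = np.zeros(n); e[j] = h
        J[:, j] = (fun(x + e) - fun(x - e)) / (2 * h)
    return J

x_star = np.zeros(4)
assert np.allclose(g(x_star), 0.0)   # x* is an equilibrium of (3)

Jg = jacobian(g, x_star)
assert np.linalg.matrix_rank(Jg, tol=1e-4) <= 2   # the claimed bound
```

In this example the rank of $J_g(x^* )$ is exactly $2$ even though $N = 4$, consistent with the bound above.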
Additional remark:
In fact, from personal computations, I am not even sure why we do not always find $rank J_g(x^* ) = 2$. The null space of $J_H(x^* )$ is spanned by $N-1$ independent vectors $v_i$ such that $J_H(x^* ) v_i = 0$. Then $A(x^* ) J_g(x^* ) v_i = 0$ too.
Now, notice that $rank A(x^* ) = N-1$, so $dim Ker(A(x^* )) = 1$. Thus there exists $\tilde{v} \neq 0$ such that $A(x^* ) \tilde{v} = 0$; i.e. (provided $\tilde{v}$ lies in the range of $J_g(x^* )$) there exists $\tilde{u} \neq 0$ such that $\tilde{v} = J_g(x^* ) \tilde{u} \neq 0$. Thus $\tilde{u}$ does not belong to the null space of $J_g(x^* )$, but it belongs to the null space of $J_H(x^* )$. So one dimension of the null space is due to $A$, and the $N-2$ others are due to $J_g$. Is something wrong with this reasoning?