Let $\lambda \in \mathbb{R}$ and let $p$ be a real-valued continuous function, and consider the following ODE:
$\begin{equation*} \left\{ \begin{aligned} & X_{\lambda}(t) = \begin{pmatrix} x_{\lambda}(t) \\ x'_{\lambda}(t) \end{pmatrix} \\ & X'_{\lambda}(t) = A_{\lambda}(t) X_{\lambda}(t) \\ & X_{\lambda}(0) = \begin{pmatrix} 0 \\ 1 \end{pmatrix} \\ & A_{\lambda}(t) = \begin{bmatrix} 0 & 1 \\ p(t) - \lambda & 0 \end{bmatrix} \end{aligned} \right. \end{equation*}$
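For intuition, the system is easy to integrate numerically. Below is a sketch using `scipy.integrate.solve_ivp`; the potential $p(t) = \cos(2\pi t)$ and the value $\lambda = 10$ are arbitrary illustrative choices, not part of the problem.

```python
import numpy as np
from scipy.integrate import solve_ivp

def p(t):
    # arbitrary continuous potential, chosen only for illustration
    return np.cos(2 * np.pi * t)

def rhs(t, X, lam):
    # X = (x, x'); X' = A_lambda(t) X with A_lambda = [[0, 1], [p(t) - lam, 0]]
    return [X[1], (p(t) - lam) * X[0]]

lam = 10.0
sol = solve_ivp(rhs, (0.0, 1.0), [0.0, 1.0], args=(lam,),
                dense_output=True, rtol=1e-10, atol=1e-12)

def x_lam(t):
    # x_lambda(t): first component of X_lambda(t)
    return sol.sol(t)[0]
```

With `dense_output=True`, `sol.sol` is a callable interpolant, so $x_{\lambda}$ can be sampled at any $t \in [0, 1]$.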
We will consider $\lVert \cdot \rVert$ to be the max norm on vectors, extended to matrices as the induced norm: the maximum over rows of the sum of the absolute values of the entries.
Let $\Phi : (\lambda, t) \in \mathbb{R}^2 \mapsto X_{\lambda}(t)$; one can show that $\Phi$ is continuous (and even differentiable).
Now fix $\mu \in \mathbb{R}$, let $0 = t_0 < t_1 < \ldots < t_k \leq 1$ be the zeros of $x_{\mu}$ in $[0, 1]$, and denote:
$A_{\varepsilon} = \{ t \in [0, 1] \mid \exists j \in [[0, k]], \lvert t - t_j \rvert \leq \varepsilon \}$ and $B_{\varepsilon} = \{ t \in [0, 1] \mid \forall j \in [[0, k]], \lvert t - t_j \rvert \geq \varepsilon \}$.
Finally, we would like to show the existence of $\varepsilon, \theta, \eta > 0$ such that:
1. $\forall j \in [[0, k - 1]],\ t_{j + 1} - t_j \geq 2\varepsilon$
2. If $t_k < 1$, then $t_k + \varepsilon \leq 1$
3. If $\lvert \lambda - \mu \rvert < \theta$, then:
   - 3a. $\forall t \in A_{\varepsilon},\ \lvert x'_{\lambda}(t) \rvert \geq \eta$
   - 3b. $\forall t \in B_{\varepsilon},\ \lvert x_{\lambda}(t) \rvert \geq \eta$
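The conditions above can also be sanity-checked numerically: locate the zeros of $x_{\mu}$ by sign changes, then measure the two lower bounds for a nearby $\lambda$. Everything here ($p$, $\mu$, $\theta$, $\varepsilon$, the grid) is an arbitrary illustrative choice:

```python
import numpy as np
from scipy.integrate import solve_ivp

def p(t):
    return np.cos(2 * np.pi * t)  # illustrative potential only

def x_and_xp(lam, ts):
    # Integrate X' = A_lambda(t) X, X(0) = (0, 1), sampled on the grid ts
    sol = solve_ivp(lambda t, X: [X[1], (p(t) - lam) * X[0]],
                    (0.0, 1.0), [0.0, 1.0], dense_output=True,
                    rtol=1e-10, atol=1e-12)
    vals = sol.sol(ts)
    return vals[0], vals[1]  # x_lambda, x'_lambda on ts

mu, theta, eps = 40.0, 0.5, 0.02
ts = np.linspace(0.0, 1.0, 2001)
x_mu, _ = x_and_xp(mu, ts)

# zeros of x_mu located by sign changes (t_0 = 0 is a zero by X(0) = (0, 1))
flips = np.nonzero(np.sign(x_mu[:-1]) * np.sign(x_mu[1:]) < 0)[0]
zeros = np.concatenate(([0.0], ts[flips]))

lam = mu + 0.9 * theta                      # a lambda with |lambda - mu| < theta
x_lam, xp_lam = x_and_xp(lam, ts)
dist = np.min(np.abs(ts[:, None] - zeros[None, :]), axis=1)
eta_a = np.min(np.abs(xp_lam[dist <= eps]))  # candidate eta for 3a on A_eps
eta_b = np.min(np.abs(x_lam[dist >= eps]))   # candidate eta for 3b on B_eps
```

Both minima come out strictly positive for these choices, which is exactly what conditions 3a and 3b ask a single $\eta > 0$ to guarantee uniformly in $\lambda$.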
What I have found so far:
Since $\Phi$ is continuous at each of $(\mu, t_0), \ldots, (\mu, t_k)$, we can find $\alpha > 0$ such that:
$\forall j \in [[0, k]], \forall a \in \mathbb{R}^2, \lVert a - (\mu, t_j) \rVert \leq \alpha \implies \lVert \Phi(a) - \Phi(\mu, t_j) \rVert \leq 1 (*)$
Note that $\lVert (\lambda, t) - (\mu, t_j) \rVert = \max \{ \lvert \lambda - \mu \rvert, \lvert t - t_j \rvert \}$, so the condition in $(*)$ splits into a condition on $\lambda$ and a condition on $t$.
- We can define $\varepsilon \leq \min \left(\alpha, 1 - t_k, \dfrac{1}{2}\min_{j \in [[0, k - 1]]} (t_{j + 1} - t_j)\right)$ (dropping the $1 - t_k$ term when $t_k = 1$, so that $\varepsilon > 0$), which satisfies (1) and (2) and gives us $(*)$ on $A_{\varepsilon}$ as long as $\lvert \lambda - \mu \rvert \leq \alpha$.
- We can define $\theta = \dfrac{\alpha}{2}$.
But I don't know how to get the lower bound on $x_{\lambda}$ and $x'_{\lambda}$ depending on how far $t$ is from a zero of $x_{\mu}$.
I know that if $x_{\mu}(t) = 0$, then $x'_{\mu}(t) \neq 0$ (otherwise $X_{\mu}(t) = 0$ and, by uniqueness, $x_{\mu} \equiv 0$), so locally around such a $t$ the derivative $x'_{\mu}$ keeps a constant sign. Thus I was hoping that whenever $t$ is near a zero of $x_{\mu}$, that is, whenever $x_{\lambda}(t)$ is close to zero, $x'_{\lambda}(t)$ should stay at least $\eta$ away from zero.
As for condition 3b, it seems to say: if $t$ is sufficiently far from every zero of $x_{\mu}$, then $x_{\lambda}(t)$ is sufficiently far from zero; it looks like a continuity argument.
My whole problem is this: continuity gives me upper bounds on differences, and I don't see how to turn those into lower bounds.
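The only idea I have for extracting a lower bound from continuity is the extreme value theorem on the compact set $B_{\varepsilon}$, something like the following (not yet verified):

```latex
\[
  2\eta := \min_{t \in B_{\varepsilon}} \lvert x_{\mu}(t) \rvert > 0
  \qquad \text{($B_{\varepsilon}$ compact, $x_{\mu}$ continuous and nonvanishing on it),}
\]
\[
  \lvert \lambda - \mu \rvert < \theta \implies
  \sup_{t \in [0, 1]} \lvert x_{\lambda}(t) - x_{\mu}(t) \rvert \leq \eta
  \qquad \text{(uniform continuity of $\Phi$ on a compact neighbourhood of $\{\mu\} \times [0, 1]$),}
\]
\[
  \forall t \in B_{\varepsilon}, \quad
  \lvert x_{\lambda}(t) \rvert \geq \lvert x_{\mu}(t) \rvert - \eta \geq \eta .
\]
```

Is this the right direction, and does a similar argument with $x'_{\mu}$ on $A_{\varepsilon}$ give 3a?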