Assume I have a dynamical system where I am only interested in the stability of some of the states, probably because the "unimportant" states are not stable.
An example could be a system where time is defined as a state:
$$ \begin{align} \dot{x}_1 &= f(x_1, x_2) \\ \dot{x}_2 &= 1 \,. \end{align} $$
Clearly, the state $x_2$ is unstable, as it represents time: $x_2\rightarrow \infty$ as $t \rightarrow \infty$. However, I don't care about this state; I am only interested in the stability of $x_1$.
Is there a (general) way to deal with this situation? I am especially interested in the case where the unstable state does not represent time, but something else that can safely "blow up".
EDIT:
To give a different example than time:
$$ \begin{align} \dot{z}_1 &= -2 z_1 + \tanh(z_2^2) - 1 \\ \dot{z}_2 &= z_2 \,. \end{align} $$
Clearly, $z_2$ will diverge. However, as $z_2 \rightarrow \pm \infty$, we have $\tanh(z_2^2) \rightarrow 1$, so the first equation becomes $\dot{z}_1 = -2 z_1$, which is stable. So, although $z_2$ blows up, $z_1$ remains stable. If I now "don't care" about $z_2$, I am done.
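A quick numerical check of this example (a minimal forward-Euler sketch; the step size, horizon, and initial conditions are arbitrary choices):

```python
import math

# Forward-Euler simulation of
#   z1' = -2*z1 + tanh(z2^2) - 1
#   z2' = z2
# z2 diverges, but z1 should still settle near 0,
# since tanh(z2^2) - 1 -> 0 as z2 grows.
dt, t_end = 1e-3, 8.0
z1, z2 = 3.0, 1.0
for _ in range(int(t_end / dt)):
    dz1 = -2.0 * z1 + math.tanh(z2 * z2) - 1.0
    dz2 = z2
    z1 += dt * dz1
    z2 += dt * dz2

print(z2)  # has blown up (roughly e^8, a few thousand)
print(z1)  # close to 0 despite the divergence of z2
```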
But what can I do if things are not as easy as in this simple example? Is there a way to formally prove stability of the part of the system I am interested in?
As the second equation results in
$$x_2(t)=t+c,$$
we can rewrite the first equation as
$$\dot{x}_1(t)=f(x_1,t+c).$$
Now you could try to study the behaviour of a scalar first-order time-varying system.
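For instance, with a hypothetical $f(x_1, x_2) = -x_1 + \sin(x_2)$ (chosen purely for illustration), the substitution $x_2 = t + c$ gives the scalar forced equation $\dot{x}_1 = -x_1 + \sin(t + c)$, whose response stays bounded even though $x_2$ grows without bound. A minimal forward-Euler sketch:

```python
import math

# Hypothetical example: f(x1, x2) = -x1 + sin(x2).
# With x2 = t + c this becomes the scalar time-varying ODE
#   x1' = -x1 + sin(t + c).
dt, t_end, c = 1e-3, 20.0, 0.0
x1, t = 0.0, 0.0
peak = 0.0
for _ in range(int(t_end / dt)):
    x1 += dt * (-x1 + math.sin(t + c))
    t += dt
    peak = max(peak, abs(x1))

print(peak)  # bounded: the steady-state amplitude is 1/sqrt(2)
```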
Edit: An alternative method would be to use differential inequalities. Let us consider
$\dot{x}_1=f(x_1,x_2)$
in which the state $x_2$ is unstable. Assume we can find bounds on $f(x_1,x_2)$ that hold for all $x_2$ (or at least along the trajectory of $x_2$):
$g(x_1)\leq f(x_1,x_2) \leq h(x_1)$.
Then it is possible to bound the derivative of $x_1$ by
$g(x_1)\leq \dot{x}_1 \leq h(x_1)$.
Then use the theorem of Petrovitsch (a comparison theorem for differential inequalities). By this procedure, you could at least bound the solution trajectories of $x_1$.
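As a sketch of how this applies to the $\tanh$ example above: since $z_2^2 \geq 0$, we have $\tanh(z_2^2) \in [0, 1)$, and therefore
$$-2 z_1 - 1 \leq \dot{z}_1 \leq -2 z_1,$$
independently of $z_2$. The comparison equations $\dot{v} = -2v - 1$ and $\dot{w} = -2w$ converge to $-1/2$ and $0$ respectively, so every trajectory $z_1(t)$ eventually enters any neighbourhood of the interval $[-1/2, 0]$, even though $z_2$ diverges.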