I have a rather peculiar question I hope you can help me with. Assume I have a (nonlinear) parameter optimizer that tries to find a local minimum of a likelihood function.
In most cases this optimizer can identify a local optimum from which it can no longer 'escape', yielding what I would call a 'stable solution' (meaning that the optimizer could, in theory, stop updating the parameters without compromising the quality of the solution).
In some rare cases, however, the optimizer identifies a solution that depends on continual changes to the parameters (it is not strictly an oscillation - the parameter changes aren't regular enough for that). This solution behaves like a local optimum, but in fact it relies on constant parameter updates; stopping the optimizer would reveal the dependence.
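To make the distinction concrete, here is a minimal toy sketch of what I mean, assuming (purely for illustration - this is not my actual model) an objective whose minimum drifts over time. While the optimizer keeps updating, it tracks the moving minimum and the loss stays small, so the solution looks stable; the moment the parameter is frozen, the loss grows:

```python
import math

def loss(x, t):
    # Hypothetical time-varying objective: the minimum drifts with t.
    return (x - math.sin(0.1 * t)) ** 2

def grad(x, t):
    return 2.0 * (x - math.sin(0.1 * t))

x = 0.0
lr = 0.5

# Phase 1: keep updating -- the parameter tracks the moving minimum,
# so the loss stays small (looks like a stable solution).
for t in range(200):
    x -= lr * grad(x, t)
tracking_loss = loss(x, 200)

# Phase 2: freeze the parameter -- the "solution" degrades,
# revealing that it depended on continual updates.
t_later = 216  # roughly a quarter period of sin(0.1 * t) later
frozen_loss = loss(x, t_later)

print(tracking_loss)  # small: the optimizer was tracking the minimum
print(frozen_loss)    # noticeably larger once updates stop
```

In my actual setting the parameter changes are far less regular than this sinusoidal drift, but the qualitative behavior is the same: the quality of the solution is maintained only by the updates themselves.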
What do you call a system like the latter? It is not truly stable (since it requires constant parameter updates), but I also wouldn't call it unstable (since the system can, in theory, remain in such a state indefinitely).