When discussing the Lyapunov stability of a dynamical system, we usually pick some point in the domain and test its stability. For example, for the ODE $\dot{x} = -x$, the origin $ x = 0 $ is asymptotically stable. However, I came across a paper on the stability of a dynamical system in which stability is defined for a set rather than a point. The authors defined the distance $$ {\rm{dist}}(x,A) = \inf_{y \in A}{\Vert x - y \Vert} $$ and then defined a closed set $ A $ to be asymptotically stable if there exists a class $ \mathcal{KL} $ function $ \beta $ such that every solution $ x(t) $ satisfies
$$ \forall t \ge t_{0}:~{\rm{dist}}(x(t),A) \le \beta({\rm{dist}}(x_{0},A), t - t_{0} ) $$
where $ t_{0} $ and $ x_{0} $ are the initial time and state, respectively. This strikes me as quite a natural generalization of the stability of an equilibrium point: for $ A = \{0\} $, the distance reduces to $ {\rm{dist}}(x, A) = \Vert x \Vert $ and the bound recovers the usual $ \mathcal{KL} $ characterization of asymptotic stability.
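To make the definition concrete, here is a small numerical sketch (the function names and the sampling of $A$ are my own, purely for illustration). For $\dot{x} = -x$ with $A = \{0\}$, the exact solution gives $\mathrm{dist}(x(t), A) = \Vert x_0 \Vert e^{-(t - t_0)}$, so the class $\mathcal{KL}$ function $\beta(r, s) = r e^{-s}$ satisfies the bound with equality:

```python
import numpy as np

def dist_to_set(x, A_points):
    """dist(x, A) = inf_{y in A} ||x - y||, taken over a finite sample of A."""
    return min(np.linalg.norm(x - y) for y in A_points)

def beta(r, s):
    # A class-KL function: for fixed s it is increasing in r and vanishes at
    # r = 0; for fixed r it decreases to 0 as s -> infinity.
    return r * np.exp(-s)

t0 = 0.0
x0 = np.array([3.0, -4.0])
A = [np.array([0.0, 0.0])]  # the closed set A = {0}, sampled by one point

# Check the bound dist(x(t), A) <= beta(dist(x0, A), t - t0) along the
# exact solution x(t) = x0 * exp(-(t - t0)) of xdot = -x.
for t in np.linspace(t0, t0 + 5.0, 50):
    x_t = x0 * np.exp(-(t - t0))
    assert dist_to_set(x_t, A) <= beta(dist_to_set(x0, A), t - t0) + 1e-12
```

Of course, for a general closed set $A$ the infimum cannot be computed by sampling finitely many points; this is only meant to illustrate how the definition is checked along a trajectory.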
I wonder whether there are good references for this definition. Is it a common definition of stability for dynamical systems? If so, I would appreciate some representative references (textbooks or papers) that introduce it.
I found some useful references on this topic: the stability of a closed invariant set, or simply, set stability. The paper in which I originally encountered the definition is
M. I. El-Hawwary and M. Maggiore, *Reduction Principles and the Stabilization of Closed Sets for Passive Systems* (2010),
in which set stability and set attractivity are defined. The definition in the OP is my reformulation of the one in the paper; honestly, I have not verified that the two are equivalent.
The references for the definition that I have found so far include
You may find these sources a bit dated, but I had no difficulty with the notation, and they are in fact very well-written books. Yoshizawa's book is the one I recommend reading first.