Given a linear ODE with constant coefficients, $x'=Ax$, whose solution with initial condition $x(0)=x_0$ is $x(t)=e^{tA}x_0$, we say that $x_0$ is unstable if $\|x(t)\|\to\infty$ as $t\to\infty$.
Suppose $A$ has an eigenvalue with positive real part. I want to prove that the unstable points are dense: given any $z\in\mathbb{R}^n$ and any $\varepsilon>0$, there exists a point $x_0$ with $\|x_0-z\|<\varepsilon$ such that $x(t)=e^{tA}x_0$ diverges.
I can prove denseness around $0$: if $\lambda$ is an eigenvalue with positive real part and $v$ is a corresponding eigenvector, then $x(t)=e^{tA}v=e^{\lambda t}v$, which diverges since $\operatorname{Re}\lambda>0$. Its initial condition is $x(0)=v$, which we can make as small as we want by scaling it down, and every nonzero scalar multiple of $v$ is still an eigenvector. (If $\lambda$ is complex, take the real part of $e^{\lambda t}v$ to get a real diverging solution.)
But how do I extend this property to an arbitrary point $z$?