This might (and hopefully will) be a very simple question but I'm quite stumped after doing some research: consider the noise-free discrete-time LTI dynamics given by
$ \boldsymbol{x}(k+1) = \boldsymbol{A} \boldsymbol{x}(k) $
We can easily prove that this system is globally exponentially stable (GES), defined as $\Vert \boldsymbol{x}(k) \Vert \leq c \cdot a^{k} \cdot \Vert \boldsymbol{x}(0) \Vert$ for all $\boldsymbol{x}(0) \in \mathbb{R}^{n}$ and some constants $c > 0$ and $0 < a < 1$ (which implies $\lim_{k \to \infty} \Vert \boldsymbol{x}(k) \Vert = 0$), if and only if $\boldsymbol{A}$ is Schur stable. Now consider instead
$ \boldsymbol{x}(k+1) = \boldsymbol{A} \boldsymbol{x}(k) + \boldsymbol{E} \boldsymbol{w}(k) $
where $\boldsymbol{w}(k) \sim \mathcal{N}(\boldsymbol{0}, \boldsymbol{\Sigma})$ is i.i.d. Gaussian noise. In that case, the system is no longer GES since it does not satisfy the given definition: GES only applies to deterministic systems (correct me if I'm wrong). However, I'm quite lost as to which other type of stability criterion I should/could use. For example, do any of (the common definitions of) bounded-input, bounded-output (BIBO) stability, input-to-state stability (ISS), $L_{2}$ stability, $L_{\infty}$ stability, mean-square stability (MSS) and almost-sure stability apply, and if so, where can I find a book/paper stating and proving these theorems for simple LTI systems? Any help would be much appreciated!
EDIT: To clarify, I am writing a report on formal methods in control and as such, I am looking for any referrals to books/papers where the definitions of (stochastic) stability, such as MSS, for LTI systems with Gaussian noise (or general noise) are clearly stated and constructive proofs of those definitions for certain types of systems are provided.
I have found several references regarding bounded disturbances, ISS and $L_p$ stability, but this is specifically not what I am after: what if we know the PDF of the noise but its support may be unbounded?
The definition of global exponential stability you consider is only really valid for systems without inputs, or with vanishing ones. When a system has inputs, deterministic or not, we consider other types of stability notions, which often imply GES of the autonomous system. In the linear deterministic case, those notions are often equivalent.
For instance, for the deterministic system $x(k+1) = Ax(k) + Bu(k)$, $y(k) = Cx(k)$, the following statements are equivalent:

- $A$ is Schur stable (i.e. the autonomous system is GES);
- the system is BIBO stable;
- the system is $\ell_2$-stable (it has finite $\ell_2$-gain);
- the system is ISS;

where we have assumed that the system $(A,B,C)$ is both observable and controllable.
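As a quick numerical sanity check of those equivalences (a sketch with hypothetical matrices $A$, $B$, not part of the original discussion): a Schur-stable $A$ yields a state that stays bounded under a bounded input.

```python
import numpy as np

# Hypothetical Schur-stable system: spectral radius of A is below 1.
A = np.array([[0.5, 0.2],
              [0.0, 0.8]])
B = np.array([1.0, 1.0])

rho = max(abs(np.linalg.eigvals(A)))  # spectral radius < 1 => GES autonomous system
print(rho)

# Drive the system with a bounded input u(k) = 1: the state remains bounded,
# consistent with the BIBO/ISS characterizations above.
x = np.zeros(2)
norms = []
for k in range(200):
    x = A @ x + B * 1.0
    norms.append(np.linalg.norm(x))
print(max(norms))  # converges toward ||(I - A)^{-1} B||
```

Here the state settles at the fixed point $(I - A)^{-1}B$, which is finite precisely because $A$ is Schur stable.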
Let us now go back to the stochastic system you consider. The additional difficulty is the randomness, but stability can be addressed using the moments of the state.
The first-order moment of the state has already been addressed by Kwin van der Veen.
The second-order moment can be seen to obey the recursion
$$ S(k+1)=AS(k)A^T+E\Sigma E^T $$ where $S(k)=\mathbb{E}[x(k)x(k)^T]$.
The noise now acts as a constant input there, and it is possible to show that this (deterministic) recursion is GES and converges to its unique equilibrium point $S^*$ if $A$ is Schur stable; i.e. all its eigenvalues lie in the open unit disc. The converse is true under some controllability condition.
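A minimal sketch of this convergence, with hypothetical matrices $A$, $E$, $\Sigma$ (any Schur-stable $A$ works): iterating the recursion from $S(0)=0$ drives it to a fixed point satisfying the discrete Lyapunov equation.

```python
import numpy as np

# Hypothetical data: A is Schur stable (eigenvalues 0.9 and 0.5).
A = np.array([[0.9, 0.2],
              [0.0, 0.5]])
E = np.eye(2)
Sigma = np.eye(2)
Q = E @ Sigma @ E.T

# Iterate S(k+1) = A S(k) A^T + E Sigma E^T from S(0) = 0.
S = np.zeros((2, 2))
for _ in range(500):
    S = A @ S @ A.T + Q

# At the fixed point, S solves S = A S A^T + Q (discrete Lyapunov equation).
residual = np.linalg.norm(S - (A @ S @ A.T + Q))
print(residual)  # ~0
```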
In particular, the stationary solution is given by
$$ S^* = \sum_{i=0}^\infty A^iE\Sigma E^T (A^T)^i $$
This value defines a ball that the state of the system will belong to in the long run, in a mean-square sense. The size of that ball depends on how large the noise covariance is, on the input matrix $E$, and on the stability properties of the system. If some eigenvalues of $A$ are close to the unit circle, then the terms of the sum will decay slowly to zero, which yields a larger value for $S^*$.
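In practice $S^*$ need not be computed from the series: it is the solution of the discrete Lyapunov equation $S^* = A S^* A^T + E\Sigma E^T$, for which standard solvers exist. A sketch with hypothetical matrices, comparing the solver's answer against a truncation of the series above:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical Schur-stable system data.
A = np.array([[0.9, 0.2],
              [0.0, 0.5]])
E = np.eye(2)
Sigma = np.eye(2)
Q = E @ Sigma @ E.T

# Closed-form stationary covariance: solves S = A S A^T + Q.
S_star = solve_discrete_lyapunov(A, Q)

# Truncation of the series  sum_i A^i Q (A^T)^i.
S_sum = np.zeros_like(Q)
Ai = np.eye(2)
for _ in range(2000):
    S_sum += Ai @ Q @ Ai.T
    Ai = Ai @ A

print(np.allclose(S_star, S_sum))  # True
```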
The above analysis also implies that
$$ \mathbb{P}(||x(k)||_2^2\ge c)\le\dfrac{\mathrm{trace}(S(k))}{c} $$
where we used Markov's inequality together with $\mathbb{E}[\Vert x(k)\Vert_2^2] = \mathrm{trace}(S(k))$. That gives an idea of what happens at the level of individual sample paths.
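The bound can be checked by Monte Carlo simulation (a sketch with hypothetical matrices and parameters $k_{\max}$, $c$; the bound is loose, as Markov's inequality typically is):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical system matrices (assumptions, not from the post).
A = np.array([[0.5, 0.1],
              [0.0, 0.7]])
E = np.eye(2)
Sigma = 0.1 * np.eye(2)

# Propagate S(k) exactly and estimate P(||x(k)||^2 >= c) empirically.
k_max, n_paths, c = 30, 20000, 1.0
X = np.zeros((n_paths, 2))   # all paths start at x(0) = 0
S = np.zeros((2, 2))
for k in range(k_max):
    W = rng.multivariate_normal(np.zeros(2), Sigma, size=n_paths)
    X = X @ A.T + W @ E.T          # x(k+1) = A x(k) + E w(k)
    S = A @ S @ A.T + E @ Sigma @ E.T

empirical = np.mean(np.sum(X**2, axis=1) >= c)
markov = np.trace(S) / c
print(empirical, markov)  # empirical frequency should sit below the bound
```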