Convergence speed dictated by spectral radius or norm of matrix?


I have a system of the type

$x^{t+1} = A x^t + \|x^t\| v^t$

where $A$ is a real matrix, possibly non-symmetric, and $v^t$ is an arbitrary real vector satisfying $\|v^t\| \leq \delta$. Here $\|\cdot\|$ denotes a vector norm on vectors and the operator norm it induces on matrices.

I want to understand the speed at which $x^t$ converges to zero (under suitable assumptions, of course). To be concrete, I want to find the smallest $\gamma$ such that $\|x^t\| \leq C \gamma^t \|x^0\|$ for some constant $C$.

I can prove that the system converges exponentially fast at a rate controlled by $\|A\| + \delta$, provided this quantity is smaller than 1. However, numerical experiments suggest that the rate is actually controlled by something like $\rho(A) + \delta$ (again, when this is smaller than 1), where $\rho(A)$ is the spectral radius of $A$.
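For reference, the bound I can prove is just the triangle inequality combined with the definition of the operator norm, applied to one step of the recursion:

$$\|x^{t+1}\| \leq \|A x^t\| + \|x^t\|\,\|v^t\| \leq \bigl(\|A\| + \delta\bigr)\,\|x^t\|,$$

which gives $\|x^t\| \leq (\|A\| + \delta)^t \|x^0\|$ by induction.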

This must be really simple, but I am missing something... any ideas?
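For concreteness, here is a minimal sketch of the kind of numerical experiment I mean (my actual setup may differ; the matrix size, the choice of $\rho(A) = 0.5$, $\delta = 0.1$, and the random directions for $v^t$ are all illustrative assumptions): simulate the recursion with perturbations of worst-case size $\delta$ and estimate the empirical rate $(\|x^T\|/\|x^0\|)^{1/T}$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random non-symmetric matrix, rescaled so that rho(A) = 0.5
# (typically rho(A) is strictly smaller than the spectral norm ||A||_2).
A = rng.standard_normal((5, 5))
A = 0.5 * A / max(abs(np.linalg.eigvals(A)))
delta = 0.1

x = rng.standard_normal(5)
x0_norm = np.linalg.norm(x)
T = 100
for _ in range(T):
    # Perturbation of worst-case size delta, in a random direction.
    v = rng.standard_normal(5)
    v = delta * v / np.linalg.norm(v)
    x = A @ x + np.linalg.norm(x) * v

# Empirical per-step rate, compared with the two candidate bounds.
rate = (np.linalg.norm(x) / x0_norm) ** (1.0 / T)
print("empirical rate:", rate)
print("rho(A) + delta:", 0.5 + delta)
print("||A|| + delta: ", np.linalg.norm(A, 2) + delta)
```

In my experiments the empirical rate sits near $\rho(A) + \delta$ rather than $\|A\| + \delta$, which is what prompts the question.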