I have three assignments in which I have to verify whether given sets are invariant with respect to a system. I don't really know how to solve them, so I hope someone is willing to help me.
1) Consider the autonomous linear system $x(k+1) = Ax(k)$ and the set $S_1 = \{ x \in \mathbb{R}^2 : x^TPx \leq 3.245\}$. Herein:
$$ A = \begin{bmatrix} 0.8 & 0.1 \\ 0.2 & -0.5 \end{bmatrix} \quad P = \begin{bmatrix} 3 & 0.075 \\ 0.075 & 1.36 \end{bmatrix}$$
Verify if the set $S_1$ is invariant with respect to the system.
A: No clue where to begin.
2) Consider the autonomous linear system $x(k+1) = Ax(k)$ and the sets $S_2 = \{ x \in \mathbb{R}^3 : ||x||_1 \leq 1\}$ and $S_3 = \{ x \in \mathbb{R}^3 : ||x||_\infty \leq 1 \}$. Herein:
$$ A = \begin{bmatrix} 0.8 & 0 & 0 \\ -0.2 & -0.5 & 0 \\ 0 & 0.3 & 1 \end{bmatrix}$$
Verify if the sets $S_2$ and $S_3$ are invariant with respect to the system.
A: Compute $||A||_1 = 1$ and $||A||_\infty = 1.3$, and use the fact that $||Ax|| \leq ||A||\,||x||$ for the corresponding induced norm.
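The norm computations in that hint are easy to sanity-check numerically. The sketch below (my own, not part of the original hint) computes both induced norms with numpy; it also tries the corner vector $x = (0,1,1)^T$, a candidate I picked because the bound $||A||_\infty = 1.3 > 1$ leaves the $\infty$-norm case undecided and a direct counterexample settles it:

```python
import numpy as np

A = np.array([[0.8,  0.0, 0.0],
              [-0.2, -0.5, 0.0],
              [0.0,  0.3, 1.0]])

# Induced matrix norms: ||A||_1 = max absolute column sum,
# ||A||_inf = max absolute row sum.
norm1 = np.linalg.norm(A, 1)        # 1.0 -> ||Ax||_1 <= ||x||_1, so S_2 is invariant
norminf = np.linalg.norm(A, np.inf)  # 1.3 -> inconclusive for S_3 by itself

# Candidate counterexample for S_3 (my choice): a vertex of the unit inf-ball.
x = np.array([0.0, 1.0, 1.0])        # ||x||_inf = 1, so x is in S_3
Ax = A @ x                           # (0, -0.5, 1.3)
print(norm1, norminf, np.linalg.norm(Ax, np.inf))  # 1.0 1.3 1.3
```

Since $||Ax||_\infty = 1.3 > 1$ for this $x \in S_3$, the set $S_3$ is not invariant, while $||A||_1 \leq 1$ already proves that $S_2$ is.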
3) Consider the autonomous linear system $x(k+1) = Ax(k)$ and the set $S_4 = \{x \in \mathbb{R}^2 : x_1 + x_2 \leq 1, x_1 \geq -1, x_2 \geq -1 \}$. Herein:
$$ A = \begin{bmatrix} 0.5 & 0 \\ -0.3 & -0.5 \end{bmatrix}$$
Verify if the set $S_4$ is invariant with respect to the system.
A: No clue where to begin.
As a rule of thumb, whenever you have a recurrence relation like $x_{k+1} = Ax_k$, it is useful to diagonalize $A$ whenever possible and abuse linearity. Suppose $Av_i = \lambda_i v_i$, $i = 1, 2$ in dimension 2. For any $x_0$, \begin{align*} x_0 &= a_1 v_1 + a_2 v_2 \\ Ax_0 &= A(a_1 v_1 + a_2 v_2) \\ &= a_1 Av_1 + a_2 Av_2 \\ &= \lambda_1 a_1 v_1 + \lambda_2 a_2 v_2 \\ &= x_1. \\ &\vdots \\ x_k &= A^k x_0 \\ &= \lambda_1^k a_1 v_1 + \lambda_2^k a_2 v_2. \end{align*} In other words, once you find the eigenvalues and the coefficients $a_1, a_2$, you don't need to multiply matrices anymore (useful).
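The derivation above is easy to check numerically. A minimal sketch (using the matrix from question 1 and an arbitrary starting point of my choosing) that compares the eigendecomposition formula $x_k = \sum_i \lambda_i^k a_i v_i$ against plain matrix powers:

```python
import numpy as np

A = np.array([[0.8, 0.1],
              [0.2, -0.5]])
x0 = np.array([1.0, 1.0])   # arbitrary initial condition for illustration

lam, V = np.linalg.eig(A)   # columns of V are the eigenvectors v_i
a = np.linalg.solve(V, x0)  # coefficients a_i such that x0 = a_1 v_1 + a_2 v_2

k = 10
xk_eig = V @ (lam**k * a)                       # x_k via scaled eigen-coefficients
xk_pow = np.linalg.matrix_power(A, k) @ x0      # x_k via repeated multiplication

print(np.allclose(xk_eig, xk_pow))  # True
```

Once `lam` and `a` are in hand, every future state costs only two scalar powers and a small linear combination, exactly as the derivation promises.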
So, for question 1), you find that the eigenvalues of $A$ are both $<1$ in absolute value (roughly $0.82$ and $-0.52$). The important thing to get from this is the geometric intuition: the eigenvalues tell you how the system evolves, and the fact that both eigenvalues are real and strictly less than 1 in absolute value tells you that $Ax$ will be "smaller". That is, if $x \in S_1$, then $Ax \in S_1$, because you have just scaled down each coordinate in the $\{v_1, v_2\}$ basis.
To make this precise, let $x \in S_1$ and write $x = a_1 v_1 + a_2 v_2$, where $v_i$ are eigenvectors of $A$. Then, in particular, you can find $x^T Px$ in terms of $v_i$ and $a_i$: \begin{align*} x^T Px &= (a_1 v_1 + a_2 v_2)^T P(a_1 v_1 + a_2 v_2) \\ &= a_1^2 v_1^T P v_1 + a_1a_2 v_1^T Pv_2 + a_1a_2 v_2^T P v_1 + a_2^2 v_2^T P v_2. \end{align*} The above is known as a quadratic form. The important thing to notice is that $(Ax)^T P(Ax)$ will look exactly as above, except you will have $$a_1^2 \lambda_1^2 v_1^T P v_1 + \lambda_1\lambda_2 a_1 a_2 v_1^T P v_2 + \lambda_1\lambda_2 a_1 a_2 v_2^T P v_1 + \lambda_2^2 a_2^2 v_2^T P v_2.$$ Here the diagonal terms are scaled by $\lambda_i^2 < 1$ and the cross terms by $\lambda_1\lambda_2$ with $|\lambda_1\lambda_2| < 1$, which strongly suggests the quadratic form decreases. (Strictly speaking, a term-by-term comparison is delicate when the cross terms are negative; the standard airtight criterion is $A^TPA \preceq P$, i.e. $P - A^TPA$ positive semidefinite, which guarantees $(Ax)^TP(Ax) \leq x^TPx$ for every $x$, and which holds for these matrices.) Thus, $S_1$ is invariant under the system.
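As a complementary check (standard in set-invariance analysis, not part of the argument above): the sublevel set $\{x : x^TPx \leq c\}$ is invariant whenever $P - A^TPA$ is positive semidefinite, since then $(Ax)^TP(Ax) \leq x^TPx$ for every $x$. A quick numerical sketch:

```python
import numpy as np

A = np.array([[0.8, 0.1],
              [0.2, -0.5]])
P = np.array([[3.0,   0.075],
              [0.075, 1.36]])

# If P - A^T P A is positive semidefinite, then x^T P x never increases
# along trajectories, so every sublevel set {x : x^T P x <= c} is invariant.
M = P - A.T @ P @ A
eigs = np.linalg.eigvalsh(M)   # eigvalsh: eigenvalues of a symmetric matrix
print(eigs.min() >= 0)  # True
```

For these matrices both eigenvalues of $P - A^TPA$ come out close to 1, comfortably positive, confirming that $S_1$ is invariant for any threshold, including 3.245.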