Studying some control theory but having difficulty learning because my lecturer doesn't provide solutions to any of his exercises AT ALL. Below I've attached a problem I've just done and my answers and would like to know if I'm on the correct track.


For those who can't view the images:
Questions
- a) Consider the system
$$ \dot{x}=Ax+Bu, x(0)=x_0 \in \mathbb{R}^n ,$$
with unrestricted control u: [0,T] $ \rightarrow \mathbb{R}^m, A \in \mathbb{R}^{n \times n}$ and $B \in \mathbb{R}^{n\times m} $.
(i) Define what it means for this system to be stabilizable.
(ii) State a theorem giving a sufficient condition on the matrices A, B for this system to be stabilizable.
(iii) Show that the system $$ \dot{x}_1=x_2+u, \quad \dot{x}_2=x_1, $$ where $ x_1, x_2, $ and u are real-valued functions, is stabilizable.
answers
(i) Consider a closed-loop feedback control of the form u=Kx. Then the system is stabilizable if there exists a matrix $ K \in \mathbb{R}^{m \times n} $ such that the system
$$ \dot{x} =(A+BK)x, x(0)=x_0 \in \mathbb{R}^n $$
is asymptotically stable.
(ii) If (A,B) is controllable, then it is stabilizable.
(iii) (A,B) is controllable if and only if rank(G)=n where G is the controllability matrix $ G=(B,AB,...,A^{n-1}B) \in \mathbb{R}^{n \times nm} $.
For this particular system one has $ A=\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \; B=\begin{bmatrix} 1 \\ 0 \end{bmatrix} $.
Since $ AB = \begin{bmatrix} 0 \\ 1 \end{bmatrix} $, this gives $ G=(B, AB)=\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} $.
Clearly (1,0) and (0,1) are linearly independent, so rank(G)=2=n. Hence (A,B) is controllable, and so also stabilizable.
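As a sanity check (my own addition, not part of the exercise), here's a short NumPy sketch that computes the controllability matrix and its rank, and also exhibits a concrete stabilizing gain. The gain K = [-2, -2] is an assumption I picked by hand: with it, A+BK has characteristic polynomial $\lambda^2+2\lambda+1$, so both eigenvalues are at -1.

```python
import numpy as np

# System from part (iii): x1' = x2 + u, x2' = x1
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
B = np.array([[1.0],
              [0.0]])

# Controllability matrix G = (B, AB) for n = 2
G = np.hstack([B, A @ B])
rank = np.linalg.matrix_rank(G)
print("G =\n", G, "\nrank(G) =", rank)  # rank should be 2 = n

# A hand-picked candidate gain (hypothetical choice, placing both poles at -1):
# A + BK = [[-2, -1], [1, 0]], char. poly lambda^2 + 2 lambda + 1
K = np.array([[-2.0, -2.0]])
eig = np.linalg.eigvals(A + B @ K)
print("eigenvalues of A + BK:", eig)  # both real parts should be negative
```

If rank(G) = n and all eigenvalues of A + BK have negative real part, this numerically confirms both the controllability argument and the stabilizability conclusion.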
^Is this correct? Thanks.