So I am studying a solved example we did in control systems class, but I'm struggling a bit with the linear algebra involved:
We are given a system $(S)$, with $x \in \mathbb{R}^n$ and $A \in \mathbb{R}^{n \times n}$, such that
$$ (S): \begin{cases}
xA = Ax \\
xB = 0
\end{cases}$$
has only one solution: $x = 0$.
And we are asked to show that $(A,B)$ is controllable. (For anyone who doesn't know control theory: in simple terms, we need to show that $\operatorname{rank}(C) = n$, where $C = [B \;\; AB \;\; \dots \;\; A^{n-1}B]$ is the controllability matrix.)
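To make the definition concrete, here is a small numpy check I put together myself (my own toy example, not from the class notes): it builds $C = [B \;\; AB \;\; \dots \;\; A^{n-1}B]$ and checks its rank.

```python
# Sketch (my own example): build the controllability matrix
# C = [B  AB  ...  A^{n-1}B] and check whether rank(C) = n.
import numpy as np

def controllability_matrix(A, B):
    """Stack the blocks B, AB, ..., A^{n-1}B side by side."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])  # next block is A times the previous one
    return np.hstack(blocks)

# A controllable pair (double integrator with input on the second state).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

C = controllability_matrix(A, B)      # C = [[0, 1], [1, 0]]
print(np.linalg.matrix_rank(C))       # 2 = n, so (A, B) is controllable
```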
Solution)
We calculate: $xC = [xB \;\; xAB \;\; xA^2B \;\; \dots \;\; xA^{n-1}B]$, so it is a matrix with blocks $xA^{j-1}B$, $j \in [1,n]$. If we take a closer look:
$$xA^{j-1}B = AxA^{j-2}B = A^2xA^{j-3}B = \dots = A^{j-1}xB = 0$$
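To convince myself of this chain, I tried a small numeric check (my own uncontrollable example, not from the notes): for such a pair there *is* a nonzero $x$ with $xA = Ax$ and $xB = 0$, and then every block $xA^{j-1}B$ vanishes, so $xC = 0$.

```python
# Numeric check of the chain x A^{j-1} B = ... = A^{j-1} x B = 0
# (my own example): an uncontrollable pair where the second state
# is never affected by the input.
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0]])
B = np.array([[1.0],
              [0.0]])
x = np.diag([0.0, 1.0])            # nonzero; commutes with A (both diagonal); xB = 0

assert np.allclose(x @ A, A @ x)   # xA = Ax
assert np.allclose(x @ B, 0)       # xB = 0

C = np.hstack([B, A @ B])          # C = [B  AB] for n = 2
print(x @ C)                       # the zero matrix: every block x A^{j-1} B = 0
```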
The part I get confused by is when he says:
Let's suppose that $\operatorname{rank}(C) = r < n$. Then there are $(n-r)$ vectors $u_i \in \mathbb{R}^n$, $i \in [1, n-r]$, orthogonal to every column of $C$. If I create a matrix $x$ whose rows are these vectors:
$$x = \begin{bmatrix} u_1^T \\ u_2^T \\ \vdots \\ u_{n-r}^T \end{bmatrix}$$
then we reach a false conclusion, therefore $\operatorname{rank}(C) = n$.
I need some more insight into how he concluded the following:
- How did he conclude that there are $(n-r)$ vectors $u_i \in \mathbb{R}^n$, $i \in [1, n-r]$, orthogonal to every column of $C$?
I thought that if $\operatorname{rank}(C) = r < n$, then there must be $(n-r)$ linearly dependent columns (vectors) in $C$, which means we can find coefficients $a_i$, not all zero, such that $\sum_i a_i c_i = 0$, where $c_i$ is a column of $C$.
Is it a theorem that he uses? If so, can anyone suggest what I should study?
- OK, let's say I understand point 1. We have then built a matrix $x$ from these orthogonal vectors such that $xC = 0$. He says: $x \neq 0$ and $x$ satisfies $(S)$, so the system has a nonzero solution, which contradicts the hypothesis that $x = 0$ is the only solution?
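In case it helps to see point 1 numerically, here is a sketch I made (my own example; I use the SVD of $C$, so the $u_i$ are the left-singular vectors belonging to the zero singular values, and there are exactly $n - r$ of them):

```python
# Sketch of point 1 (my own example): if rank(C) = r < n, the columns of C
# span an r-dimensional subspace of R^n, so there are n - r independent
# vectors orthogonal to every column of C.
import numpy as np

# An uncontrollable pair: the input never reaches the second state.
A = np.array([[1.0, 0.0],
              [0.0, 2.0]])
B = np.array([[1.0],
              [0.0]])
C = np.hstack([B, A @ B])          # C = [B  AB] = [[1, 1], [0, 0]]

n = A.shape[0]
r = np.linalg.matrix_rank(C)       # r = 1 < n = 2
U, s, Vt = np.linalg.svd(C)
u = U[:, r:]                       # n - r vectors orthogonal to the columns of C
print(u.T @ C)                     # ~[[0, 0]]: orthogonal to every column
```

Stacking the columns of `u` as rows then gives exactly the nonzero matrix $x$ with $xC = 0$ from the solution.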