I am rather stuck on an exercise concerning the companion/controllability matrix (the exercise stems from a course in control theory).
Given the companion matrix \begin{equation} A=\left(\begin{array}{cccccc} -a_1 & -a_2 & -a_3 & \cdots & -a_{n-1}&-a_n\\ 1 & 0 & 0 & \cdots & 0 & 0 \\ 0 & 1 &0 & \cdots & 0 &0 \\ \vdots & \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 &0 & \cdots & 1 &0 \end{array} \right) \end{equation} associated with the polynomial $a(s)=\sum_{i=0}^n a_is^{n-i}$, show for another polynomial
$$g(s)=\sum_{i=0}^n g_is^{n-i}$$
that
$$g(A)=((A^{n-1})'c'\: ...\: A'c' \text{ } c')'$$ where $c'=(g_1, ..., g_n)'$ is a column vector. Note that I am using the somewhat unusual convention of $'$ for the transpose.
EDIT: I forgot to write, $a_0=1, g_0=0$.
I've been playing around a bit with the inverse $g(A)^{-1}$, but I am not sure it even exists; I have also tried to factor out the $c'$, but to no avail.
Any suggestion or help otherwise is appreciated.
Thanks in advance.
Presumably, we have $a_0 = 1$, and $g_0 = 0$. Otherwise, I don't see how this can make sense. Also, I assume you mean $g(A')$ rather than $g(A)$.
First, prove that whatever the answer is, it must be a linear function of $c'$. That is, $g(A')$ is a linear function of the coefficients of $g$.
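Explicitly, since $g_0 = 0$, the matrix polynomial expands as
$$ g(A') = \sum_{i=1}^{n} g_i \,(A')^{n-i}, $$
so each coefficient $g_i$ enters linearly; and each column $(A^{n-k})'c'$ of the claimed matrix is likewise linear in $c' = (g_1,\dots,g_n)'$.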
Now, it suffices to prove that this holds on a basis. Let the vectors $e_j$ denote the canonical basis (so, for example, $e_1 = (1,0,\dots,0)'$). It's enough to prove that the formula holds when $c' = e_j$, which is to say that $g(s) = s^{n-j}$. That is, we need to show that for each $j$, $$ (A^{n-j})' = \pmatrix{(A^{n-1})' e_{j} & (A^{n-2})' e_{j} & \cdots & e_{j}} $$
Now, we need two important properties of $A$:

1. $A' e_j = e_{j-1}$ for $j = 2, \dots, n$, since for $j \geq 2$ the $j$th row of $A$ is $e_{j-1}'$.
2. $(A^k)' e_n = e_{n-k}$ for $k = 0, \dots, n-1$ (apply property 1 repeatedly). This says exactly that the claimed identity holds for $c' = e_n$, i.e. for $i = 0$.
It is notable (but not of use to us in itself) that $a(A) = 0$.
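The two structural facts the induction below relies on, $A'e_j = e_{j-1}$ for $j \geq 2$ and $(A^k)'e_n = e_{n-k}$, can be sanity-checked numerically. This is a minimal sketch (the helper name `companion` is mine, not from the question), using 0-based NumPy indexing so `e[:, j]` stands for $e_{j+1}$:

```python
import numpy as np

def companion(a):
    # Companion matrix of a(s) = s^n + a_1 s^{n-1} + ... + a_n,
    # with first row -a_1, ..., -a_n and ones on the subdiagonal.
    n = len(a)
    A = np.zeros((n, n))
    A[0, :] = -np.asarray(a, dtype=float)
    A[1:, :-1] = np.eye(n - 1)
    return A

n = 4
A = companion([1.0, -2.0, 0.5, 3.0])
e = np.eye(n)  # e[:, j] is the canonical basis vector e_{j+1}

# Property 1: A' e_j = e_{j-1} for j = 2, ..., n
for j in range(1, n):
    assert np.allclose(A.T @ e[:, j], e[:, j - 1])

# Property 2: (A^k)' e_n = e_{n-k} for k = 0, ..., n-1
for k in range(n):
    assert np.allclose(np.linalg.matrix_power(A.T, k) @ e[:, n - 1],
                       e[:, n - 1 - k])

print("both properties hold")
```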
To finish the proof, we need to show that the identity holds for $j=n-i$ for each $i = 1,\dots,n-1$ (property 2 was the identity for $i=0$). That is, we need to show the identity holds for $g(s) = s^i$, i.e. that $$ (A^i)' = \pmatrix{(A^{n-1})' e_{n-i} & (A^{n-2})' e_{n-i} & \cdots & e_{n-i}} $$ for each $i = 1,\dots,n-1$. We can prove that this holds using induction, starting from the $i=0$ case already established: $$ \begin{align} (A^{i+1})' &= A' (A^i )' \\ & = A' \pmatrix{(A^{n-1})' e_{n-i} & (A^{n-2})' e_{n-i} & \cdots & e_{n-i}} \\ & = \pmatrix{(A^{n})' e_{n-i} & (A^{n-1})' e_{n-i} & \cdots & A' e_{n-i}} \\ & = \pmatrix{(A^{n-1})' (A'e_{n-i}) & (A^{n-2})' (A'e_{n-i}) & \cdots & (A' e_{n-i})} \\ & = \pmatrix{(A^{n-1})' e_{n-(i+1)} & (A^{n-2})' e_{n-(i+1)} & \cdots & e_{n-(i+1)}} \end{align} $$ where the last step uses property 1, which applies since $n-i \geq 2$ whenever $i \leq n-2$. The conclusion follows.
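As a final sanity check, the full identity $g(A') = \pmatrix{(A^{n-1})'c' & \cdots & A'c' & c'}$ can be verified numerically for random coefficients. A minimal sketch, assuming the companion-matrix layout from the question (the helper names `companion` and `eval_g` are mine):

```python
import numpy as np

def companion(a):
    # Companion matrix of a(s) = s^n + a_1 s^{n-1} + ... + a_n,
    # with first row -a_1, ..., -a_n and ones on the subdiagonal.
    n = len(a)
    A = np.zeros((n, n))
    A[0, :] = -np.asarray(a, dtype=float)
    A[1:, :-1] = np.eye(n - 1)
    return A

def eval_g(M, g):
    # g(M) = sum_{i=1}^n g_i M^{n-i}, using g_0 = 0 as in the question.
    n = len(g)
    return sum(gi * np.linalg.matrix_power(M, n - i)
               for i, gi in enumerate(g, start=1))

rng = np.random.default_rng(1)
n = 5
a = rng.standard_normal(n)  # coefficients a_1, ..., a_n
g = rng.standard_normal(n)  # coefficients g_1, ..., g_n
A = companion(a)
c = g  # c' = (g_1, ..., g_n)' as a column vector

# Right-hand side: columns (A^{n-1})'c', (A^{n-2})'c', ..., A'c', c'
rhs = np.column_stack([np.linalg.matrix_power(A.T, n - 1 - k) @ c
                       for k in range(n)])

assert np.allclose(eval_g(A.T, g), rhs)
print("g(A') matches the stacked-columns formula for n =", n)
```

Note that `g(A.T)` rather than `g(A)` is what matches the stacked columns directly, consistent with the remark above that the question's formula should read $g(A')$.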