Determining the eigenvalues (or computing the determinant) of a block matrix (not block diagonal) arising in a state-space representation.


Studying the decentralized control of a robotic manipulator, I face the problem of finding the eigenvalues of a block matrix with a curious form. Specifically, a proportional-integral-derivative controller is used to generate the $n$ command torques that move the robot's $n$ joints in such a way that the non-linearities of the dynamic model are compensated.

I do not think it is useful to go into more detail on the robotics side, so I will get directly to the point: the overall dynamic behaviour of the closed-loop control system is governed by a third-order differential equation involving the error $e(t) = q_r(t)-q(t) \in \mathbb{R}^n$ between the reference generalized position $q_r(t)\in\mathbb{R}^n$ and the robot's actual generalized position $q(t)\in\mathbb{R}^n$, together with its derivative and antiderivative: $$\boldsymbol{\ddot e} + \boldsymbol{K_D} \boldsymbol{\dot e} + \boldsymbol{K_P}\boldsymbol{e}+\boldsymbol{K_I}\int_{-\infty}^t \boldsymbol{e}\,\mathrm{d}\tau = \boldsymbol{0}$$ with $\boldsymbol{K_D}, \boldsymbol{K_P}, \boldsymbol{K_I} \in \mathbb{R}^{n\times n}$.

This equation can be represented in a state-space model using a $3n$-dimensional state vector $\boldsymbol\xi \in \mathbb{R}^{3n}$ defined as follows: $$\boldsymbol\xi = \begin{pmatrix} \boldsymbol\xi_1 \\ \boldsymbol\xi_2 \\ \boldsymbol\xi_3 \end{pmatrix} = \begin{pmatrix} \int_{-\infty}^t \boldsymbol{e}\,\mathrm{d}\tau \\ \boldsymbol e \\ \boldsymbol{\dot e} \end{pmatrix}$$ where $\boldsymbol\xi_1,\boldsymbol\xi_2,\boldsymbol\xi_3 \in \mathbb{R}^n$.

So the state equation is: $$ \boldsymbol{\dot \xi} = \begin{pmatrix} \boldsymbol{\dot \xi_1} \\ \boldsymbol{\dot \xi_2} \\ \boldsymbol{\dot \xi_3} \end{pmatrix} = \begin{pmatrix} \boldsymbol{O} & \boldsymbol{I} & \boldsymbol{O}\\ \boldsymbol{O} & \boldsymbol{O} & \boldsymbol{I}\\ -\boldsymbol{K_I} & -\boldsymbol{K_P} & -\boldsymbol{K_D} \end{pmatrix} \begin{pmatrix} \boldsymbol\xi_1 \\ \boldsymbol\xi_2 \\ \boldsymbol\xi_3 \end{pmatrix} = \boldsymbol A \boldsymbol \xi$$ where $\boldsymbol A \in \mathbb{R}^{3n \times 3n}$.
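As a quick numerical sanity check, the matrix $\boldsymbol A$ can be assembled and its eigenvalues computed directly. The gain matrices below are purely illustrative (diagonal, chosen to satisfy the scalar Routh condition $k_d k_p > k_i$ per joint), not values from any actual robot:

```python
import numpy as np

n = 2  # number of joints (illustrative)

# Hypothetical diagonal gain matrices, for illustration only
K_D = np.diag([5.0, 6.0])
K_P = np.diag([4.0, 3.0])
K_I = np.diag([1.0, 2.0])

O, I = np.zeros((n, n)), np.eye(n)
# Block companion matrix A of the 3n-dimensional state equation
A = np.block([
    [O,    I,    O],
    [O,    O,    I],
    [-K_I, -K_P, -K_D],
])

eigs = np.linalg.eigvals(A)  # the 3n closed-loop poles
print(np.sort(eigs.real))    # all negative for these gains
```

With diagonal gains the system decouples into $n$ independent scalar third-order systems, so the eigenvalues of $\boldsymbol A$ are just the roots of $s^3 + k_d s^2 + k_p s + k_i$ for each joint.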

Now, in order to study the stability of the system, the eigenvalues of this block matrix have to be found. By definition, the problem is equivalent to finding the zeros of the determinant of $s\boldsymbol I-\boldsymbol A$, which I compute recursively using only the Schur-complement rule for the determinant of a block matrix. $$\det(s\boldsymbol I-\boldsymbol A) = \det\begin{pmatrix} s\boldsymbol I & -\boldsymbol{I} & \boldsymbol{O}\\ \boldsymbol{O} & s\boldsymbol I & -\boldsymbol{I}\\ \boldsymbol{K_I} & \boldsymbol{K_P} & s\boldsymbol I + \boldsymbol{K_D} \end{pmatrix} = $$ $$ = \det\left( s\boldsymbol I\right)\det\left(\begin{pmatrix} s\boldsymbol I & -\boldsymbol{I}\\ \boldsymbol{K_P} & s\boldsymbol I + \boldsymbol{K_D} \end{pmatrix} - \begin{pmatrix} \boldsymbol{O}\\ \boldsymbol{K_I} \end{pmatrix} \left( s\boldsymbol I \right)^{-1} \begin{pmatrix} -\boldsymbol{I} & \boldsymbol{O} \end{pmatrix} \right) =$$

$$ =s^n \det\left(\begin{pmatrix} s\boldsymbol I & -\boldsymbol{I}\\ \boldsymbol{K_P} & s\boldsymbol I + \boldsymbol{K_D} \end{pmatrix} + \begin{pmatrix} \boldsymbol{O} & \boldsymbol{O}\\ \frac{1}{s} \boldsymbol K_I & \boldsymbol{O} \end{pmatrix} \right) = s^n \det\begin{pmatrix} s\boldsymbol I & -\boldsymbol{I}\\ \boldsymbol K_P + \frac{1}{s} \boldsymbol K_I & s\boldsymbol I + \boldsymbol{K_D} \end{pmatrix} = $$ $$ = s^n \det\left( s\boldsymbol I\right)\det\left( \left(s\boldsymbol I + \boldsymbol{K_D} \right) - \left(\boldsymbol K_P + \frac{1}{s} \boldsymbol K_I \right)\left( s\boldsymbol I \right)^{-1} \left(-\boldsymbol I \right) \right) = $$ $$ = s^{2n}\det\left(s\boldsymbol I + \boldsymbol{K_D} + \frac{1}{s}\boldsymbol K_P + \frac{1}{s^2}\boldsymbol K_I\right) = 0$$
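The end result of this derivation can be checked numerically: for any $s \neq 0$, $\det(s\boldsymbol I - \boldsymbol A)$ should equal $\det(s^3\boldsymbol I + s^2\boldsymbol K_D + s\boldsymbol K_P + \boldsymbol K_I)$. A minimal sketch with random (hypothetical) gains:

```python
import numpy as np

n = 2
rng = np.random.default_rng(1)
# Random gain matrices, purely for the numerical check
K_D, K_P, K_I = (rng.standard_normal((n, n)) for _ in range(3))

O, I = np.zeros((n, n)), np.eye(n)
A = np.block([[O, I, O], [O, O, I], [-K_I, -K_P, -K_D]])

s = 0.7 + 1.3j  # arbitrary test point, s != 0
lhs = np.linalg.det(s * np.eye(3 * n) - A)
rhs = np.linalg.det(s**3 * I + s**2 * K_D + s * K_P + K_I)
print(abs(lhs - rhs))  # zero up to rounding
```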

On the one hand this expression seems to imply that $0$ should be an eigenvalue; on the other hand it looks like $s=0$ has to be excluded because it would end up in a denominator.

Anyway, my robotics course slides simply state that, by taking the Laplace transform of the equation, namely $$s^2\boldsymbol E(s)+\boldsymbol K_D s\boldsymbol E(s)+\boldsymbol K_P \boldsymbol E(s) + \boldsymbol K_I \frac{\boldsymbol E(s)}{s} = \boldsymbol 0,$$ one should immediately recognize that the system poles can be found as the zeros of the determinant of the matrix $$s^3 \boldsymbol I + s^2 \boldsymbol K_D+ s\boldsymbol K_P + \boldsymbol K_I,$$ which is exactly the matrix I found, once the common factor $\frac{1}{s^2}$ is brought out of the determinant as $s^{-2n}$ and cancelled against $s^{2n}$.

Why and how does this equation turn out so simply, without actually computing a determinant? Where is the error in my procedure? Is there a simpler or more direct way to find the eigenvalues of such a matrix?

Looking carefully at it, one can see a strong similarity between this block matrix and the classical companion matrix that always appears when a state-space representation is built from a transfer function or a differential equation. The companion matrix lets one read off the characteristic polynomial immediately: the entries of its last row, with sign changed and read left to right, are the coefficients of increasing powers of $s$, up to the monic term $s^n$. Is there by chance an analogous property for this kind of "block companion" matrix?
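For the scalar companion matrix the read-off property is easy to verify numerically; a minimal sketch with illustrative coefficients for $p(s) = s^3 + a_2 s^2 + a_1 s + a_0$:

```python
import numpy as np

# Scalar companion matrix for p(s) = s^3 + a2*s^2 + a1*s + a0
a2, a1, a0 = 5.0, 4.0, 1.0
C = np.array([
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [-a0, -a1, -a2],  # last row: negated coefficients, lowest degree first
])

# np.poly returns the characteristic polynomial, leading coefficient first
print(np.poly(C))  # ≈ [1., 5., 4., 1.]
```

The block case replaces the scalar entries $a_i$ with the matrices $\boldsymbol K_I, \boldsymbol K_P, \boldsymbol K_D$, and the analogous statement is the one derived above for $\det(s\boldsymbol I - \boldsymbol A)$.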


EDIT:

A. Exactly what I thought; my only remaining puzzlement is whether zero is an eigenvalue (with multiplicity $2n$). I know it is probably a silly question, but it is confusing me.

B. Is what you have just derived the transfer function between the initial state $\boldsymbol \xi (0)$ and the state $\boldsymbol \xi (t)$? I am not sure I got what you mean, so correct me if I'm wrong: as you have said, if $\boldsymbol \xi (0) = \boldsymbol 0$, I should obtain $$\det(s \boldsymbol I -\boldsymbol A)\boldsymbol E(s) = \det\left(s^3 \boldsymbol I + s^2 \boldsymbol K_D + s \boldsymbol K_P + \boldsymbol K_I\right) \boldsymbol E(s)= \boldsymbol 0,$$ while the equation reported in the slides looks slightly different: $$\left(s^3 \boldsymbol I + s^2 \boldsymbol K_D + s \boldsymbol K_P + \boldsymbol K_I\right)\boldsymbol E(s) = \boldsymbol 0,$$ where the third-degree polynomial is a matrix rather than a scalar as in the previous equation. Are they somehow equivalent?

C. Thank you for the feedback; it's always difficult to convince myself that I have actually proved something new.

BEST ANSWER

A. First of all, your analysis appears correct. From the final derivation you can write $$s^{2n}\det\left(s\mathbf{I}+\mathbf{K_D}+\frac{1}{s}\mathbf{K_P}+\frac{1}{s^2}\mathbf{K_I}\right)=s^{2n}\det\left(\frac{1}{s^2}\mathbf{I}\cdot\left(s^3\mathbf{I}+s^2\mathbf{K_D}+s\mathbf{K_P}+\mathbf{K_I}\right)\right)\\ =s^{2n}\det\left(\frac{1}{s^2}\mathbf{I}\right)\det\left(s^3\mathbf{I}+s^2\mathbf{K_D}+s\mathbf{K_P}+\mathbf{K_I}\right)\\ =s^{2n}\left(\frac{1}{s^2}\right)^n\det\left(s^3\mathbf{I}+s^2\mathbf{K_D}+s\mathbf{K_P}+\mathbf{K_I}\right)\\ =\det\left(s^3\mathbf{I}+s^2\mathbf{K_D}+s\mathbf{K_P}+\mathbf{K_I}\right),$$ which is exactly your slide equation. Note that the factor $s^{2n}$ cancels completely, so $0$ is not automatically an eigenvalue: evaluating the final determinant at $s=0$ gives $\det(\mathbf{K_I})$, hence $0$ is an eigenvalue of $\mathbf{A}$ if and only if $\mathbf{K_I}$ is singular.
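The cancellation above can be confirmed numerically at any $s \neq 0$; a minimal sketch with random (hypothetical) gain matrices:

```python
import numpy as np

n = 2
rng = np.random.default_rng(2)
# Random gain matrices, purely for the numerical check
K_D, K_P, K_I = (rng.standard_normal((n, n)) for _ in range(3))
I = np.eye(n)

s = 0.5 - 2.0j  # arbitrary nonzero test point
lhs = s**(2 * n) * np.linalg.det(s * I + K_D + K_P / s + K_I / s**2)
rhs = np.linalg.det(s**3 * I + s**2 * K_D + s * K_P + K_I)
print(abs(lhs - rhs))  # zero up to rounding
```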

B. Since $\mathbf{\dot{\xi}}=\mathbf{A}\mathbf{\xi}$, the Laplace transform yields $$\Xi(s)=(s\mathbf{I}-\mathbf{A})^{-1}\xi(0)=\frac{\operatorname{adj}(s\mathbf{I}-\mathbf{A})}{\det(s\mathbf{I}-\mathbf{A})}\xi(0)$$ and for $E(s)$ we have $$E(s)=[\mathbf{0}_{n\times n} \quad \mathbf{I}_n \quad \mathbf{0}_{n\times n}]\,\Xi(s)=\frac{1}{\det(s\mathbf{I}-\mathbf{A})}N(s)$$ with $N(s)$ a vector of dimension $n$ whose elements are polynomials of degree at most $3n-1$, defined by $$N(s):=[\mathbf{0}_{n\times n} \quad \mathbf{I}_n \quad \mathbf{0}_{n\times n}]\,\operatorname{adj}(s\mathbf{I}-\mathbf{A})\,\xi(0).$$ Thus $$\det(s\mathbf{I}-\mathbf{A})E(s)=N(s),$$ which is exactly the equation you have written (with $\xi(0)=0$ it reduces to $\det(s\mathbf{I}-\mathbf{A})E(s)=\mathbf{0}$).
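The relation $\det(s\mathbf{I}-\mathbf{A})E(s)=N(s)$ can be illustrated numerically, using $\operatorname{adj}(M)=\det(M)\,M^{-1}$ for invertible $M$; the initial state below is a hypothetical value chosen only for the check:

```python
import numpy as np

n = 2
rng = np.random.default_rng(3)
# Random gain matrices and a hypothetical initial state
K_D, K_P, K_I = (rng.standard_normal((n, n)) for _ in range(3))
O, I = np.zeros((n, n)), np.eye(n)
A = np.block([[O, I, O], [O, O, I], [-K_I, -K_P, -K_D]])
xi0 = rng.standard_normal(3 * n)

s = 1.0 + 0.5j                      # arbitrary test point
M = s * np.eye(3 * n) - A
detM = np.linalg.det(M)
adjM = detM * np.linalg.inv(M)      # adj(M) = det(M) * inv(M)

S = np.hstack([O, I, O])            # selects the middle block, i.e. e(t)
E = S @ np.linalg.solve(M, xi0)     # E(s) = S (sI - A)^{-1} xi(0)
N = S @ adjM @ xi0                  # N(s) = S adj(sI - A) xi(0)
print(np.allclose(detM * E, N))     # True
```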

C. With respect to the property of the characteristic polynomial of a block companion matrix, well, you have just proved it!