Example: The definite integral of a singular matrix need not be singular


I was reading a book on Linear Control Systems by Prof. Roger Brockett (1970, Wiley).

At the end of Section 1.1, Prof. Brockett asks:

Suppose $K(t)$ is singular for all $t$. Then is $$ \int\limits_0^T \ K(t) \, dt \ \ \ \mbox{singular}? $$

Brockett (1970) gives a hint to consider a matrix defined by an outer product $$ K(t) = \left[ \begin{array}{c} \sin t \\ \cos t \end{array} \right] \ \left[ \begin{array}{cc} \sin t & \cos t \\ \end{array} \right] $$ and calculate $\int\limits_0^{2 \pi} \ K(t) dt$.

I calculated and found $$ K(t) = \left[ \begin{array}{cc} \sin^2 t & \sin t \cos t\\ \sin t \cos t & \cos^2 t \end{array} \right] $$

Obviously, $\mbox{det}[K(t)] = 0$ for all $t \in \mathbf{R}$.

Thus, $K(t)$ is singular for all values of $t$.

Moreover, $$ I = \int\limits_0^{2 \pi} \ \left[ \begin{array}{cc} \sin^2 t & \sin t \cos t\\ \sin t \cos t & \cos^2 t \end{array} \right] \ dt = {1 \over 2} \int\limits_0^{2 \pi} \ \left[ \begin{array}{cc} 1 - \cos 2 t & \sin 2 t \\ \sin 2 t & 1 + \cos 2 t \end{array} \right] \ dt $$

Integrating, we get $$ I = {1 \over 2} \left[ \begin{array}{cc} t - {\sin 2 t \over 2} & - {\cos 2 t \over 2} \\[2mm] - {\cos 2 t \over 2} & t + {\sin 2 t \over 2} \end{array} \right]_0^{2 \pi} = \left[ \begin{array}{cc} \pi & 0 \\ 0 & \pi \end{array} \right] $$

Clearly, $\mbox{det}(I) = \pi^2 \neq 0$.

Thus, the definite integral of a singular matrix need not be singular.
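The calculation can also be sanity-checked numerically; here is a minimal sketch with NumPy (the midpoint grid and variable names are my own choices, not part of Brockett's exercise):

```python
import numpy as np

# Midpoint-rule approximation of I = ∫_0^{2π} K(t) dt, where
# K(t) = [sin t, cos t]^T [sin t, cos t] as in the hint.
n = 10_000
ts = (np.arange(n) + 0.5) * (2 * np.pi / n)
s, c = np.sin(ts), np.cos(ts)
I = np.array([[np.sum(s * s), np.sum(s * c)],
              [np.sum(s * c), np.sum(c * c)]]) * (2 * np.pi / n)

print(I)                  # ≈ [[π, 0], [0, π]]
print(np.linalg.det(I))   # ≈ π², nonzero
```

Equally spaced nodes integrate trigonometric polynomials over a full period essentially exactly, so the agreement with $\pi I_2$ is to machine precision.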

I hope that the calculations (example) are correct. Any other simple example?

Is there any control theoretic interpretation for this example? (Exercise problem)


There are 3 answers below.

Accepted answer:

Here is another example:
$K(t)=\pmatrix{e^{2t} & te^t\\ te^t & t^2}$

Of course it is singular for every $t$ (its determinant is $t^2e^{2t}-t^2e^{2t}=0$), but its integral over $[0,2]$ is

$ I = \left[\pmatrix{\frac{e^{2t}}{2} & e^t(t-1)\\ e^t(t-1) & \frac{t^3}{3}}\right]_0^2 = \pmatrix{\frac{e^4-1}{2} & e^2+1\\ e^2+1 & \frac{8}{3}}, $

whose determinant is $\frac{e^4}{3}-2e^2-\frac{7}{3}\approx1.09\neq0$, so $I$ is nonsingular.
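This too can be verified numerically; a quick NumPy sketch (the grid size is my own choice):

```python
import numpy as np

# Midpoint-rule check of ∫_0^2 K(t) dt for K(t) = [[e^{2t}, t e^t], [t e^t, t²]].
n = 200_000
ts = (np.arange(n) + 0.5) * (2.0 / n)
e = np.exp(ts)
I = np.array([[np.sum(e * e), np.sum(ts * e)],
              [np.sum(ts * e), np.sum(ts * ts)]]) * (2.0 / n)

print(I)                  # ≈ [[(e⁴-1)/2, e²+1], [e²+1, 8/3]]
print(np.linalg.det(I))   # ≈ e⁴/3 - 2e² - 7/3 ≈ 1.09, nonzero
```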

Answer:

$K(t)=\pmatrix{1\\ t}\pmatrix{1&t}=\pmatrix{1&t\\ t&t^2}$ is singular but $\int_0^TK(t)dt=\pmatrix{T&\frac{T^2}{2}\\ \frac{T^2}{2}&\frac{T^3}{3}}$ is not.
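The determinant here has a clean closed form: $\det\int_0^TK(t)\,dt = T\cdot\frac{T^3}{3}-\left(\frac{T^2}{2}\right)^2 = \frac{T^4}{12}$, positive for every $T>0$. A quick check (the choice $T=3$ is arbitrary):

```python
import numpy as np

# Exact integral of K(t) = [1, t]^T [1, t] over [0, T]; T = 3 is arbitrary.
T = 3.0
I = np.array([[T, T**2 / 2],
              [T**2 / 2, T**3 / 3]])
print(np.linalg.det(I))  # T⁴/12 = 6.75, nonzero
```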

Remark. The OP's question inspires the following question. Suppose $K$ is continuous on $[0,T]$ and $\int_JK(t)dt$ is singular for every interval $J\subseteq[0,T]$. Do all $K(t)$s necessarily share a common left or right eigenvector for the zero eigenvalue?

It turns out that the answer is “no” when the matrices are at least $3\times3$. For a counterexample, let $x$ and $y$ be two mutually orthogonal nonzero real vectors and consider $$ K(t)=\pmatrix{tI&x\\ y^T&0}. $$
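To make the counterexample concrete, the sketch below instantiates it with $x=(1,0)^T$ and $y=(0,1)^T$ (my choice); the interval integral $\int_s^uK(t)\,dt$ is computed in closed form by hand. Every subinterval integral is singular, yet the null vectors $(1,0,-t)^T$ (right) and $(0,1,-t)$ (left) move with $t$:

```python
import numpy as np

# K(t) = [[t*I, x], [y^T, 0]] with x = (1, 0), y = (0, 1) orthogonal.
def K(t):
    return np.array([[t, 0.0, 1.0],
                     [0.0, t, 0.0],
                     [0.0, 1.0, 0.0]])

# ∫_s^u K(t) dt: the diagonal block becomes (u² - s²)/2 * I and the
# off-diagonal blocks are scaled by (u - s).
def integral(s, u):
    return np.array([[(u**2 - s**2) / 2, 0.0, u - s],
                     [0.0, (u**2 - s**2) / 2, 0.0],
                     [0.0, u - s, 0.0]])

rng = np.random.default_rng(0)
for _ in range(5):
    s, u = sorted(rng.uniform(0, 1, size=2))
    print(np.linalg.det(integral(s, u)))  # 0 (up to rounding) every time

# The right null vector of K(t) is (1, 0, -t), which depends on t:
print(K(1.0) @ np.array([1.0, 0.0, -1.0]))  # zero vector
print(K(2.0) @ np.array([1.0, 0.0, -1.0]))  # nonzero: the null space moved
```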

Interestingly, the answer is “yes” when the matrices are $2\times2$. Let $$ K(t)=\pmatrix{\alpha(t)&\beta(t)\\ \gamma(t)&\delta(t)}. $$ If $K(t_0)\ne0$ for some $t_0$, by considering $[0,t_0]$ and $[t_0,T]$ separately and by performing a change of variable if necessary, we may assume that $K(0)\ne0$. Furthermore, by replacing $K(t)$ by $UK(t)V$ for some appropriate constant nonsingular matrices $U$ and $V$, we may further assume that $K(0)=\operatorname{diag}(1,0)$, i.e., $\alpha(0)=1$ and $\beta(0)=\gamma(0)=\delta(0)=0$.

By assumption, $\det\int_s^uK(t)\,dt=0$ for all $s,u\in[0,T]$. Hence \begin{align} &\left(\int_0^s\alpha(t)dt+\int_s^u\alpha(t)dt\right) \left(\int_0^s\delta(t)dt+\int_s^u\delta(t)dt\right)\\ =\,&\left(\int_0^s\beta(t)dt+\int_s^u\beta(t)dt\right) \left(\int_0^s\gamma(t)dt+\int_s^u\gamma(t)dt\right) \end{align} and in turn, since $\det\int_0^sK(t)\,dt=\det\int_s^uK(t)\,dt=0$ cancels the products taken over a single subinterval, \begin{align} &\left(\int_0^s\alpha(t)dt\right)\left(\int_s^u\delta(t)dt\right) +\left(\int_s^u\alpha(t)dt\right)\left(\int_0^s\delta(t)dt\right)\\ =\,&\left(\int_0^s\beta(t)dt\right)\left(\int_s^u\gamma(t)dt\right) +\left(\int_s^u\beta(t)dt\right)\left(\int_0^s\gamma(t)dt\right). \end{align} Differentiating both sides with respect to $u$, we obtain $$ \delta(u)\int_0^s\alpha(t)dt +\alpha(u)\int_0^s\delta(t)dt =\gamma(u)\int_0^s\beta(t)dt +\beta(u)\int_0^s\gamma(t)dt.\tag{1} $$ Putting $u=0$ and recalling that $\alpha(0)=1$ and $\beta(0)=\gamma(0)=\delta(0)=0$, this reduces to $\int_0^s\delta(t)dt=0$ for all $s$. Therefore $\delta=0$. But then $(1)$ reduces to $$ \gamma(u)\int_0^s\beta(t)dt +\beta(u)\int_0^s\gamma(t)dt=0. $$ Differentiating both sides with respect to $s$, we get $$ \gamma(u)\beta(s)+\beta(u)\gamma(s)=0.\tag{2} $$ In particular, when $u=s$, we have $\beta(s)\gamma(s)=0$. So, if $\beta(s)\ne0$ for some $s$, then $\gamma(s)=0$ and $(2)$ implies that $\gamma=0$. Similarly, if $\gamma(s)\ne0$ for some $s$, then $\beta=0$. Hence one of $\beta$ or $\gamma$ is the zero function. Since $\delta$ is also zero, when $\gamma=\delta=0$ the second row of every $K(t)$ vanishes, so $(0,1)$ is a common left eigenvector for the zero eigenvalue; when $\beta=\delta=0$ the second column vanishes, so $(0,1)^T$ is a common right eigenvector.

Answer:

There's no need to go into any complicated examples. Just consider that an integral can express any sum. (Arguably, an integral is a sum, by construction.) Now consider, for an arbitrary vector space $V$ with basis $(e_i)_i$, the projection operators $K_i$ onto each basis element. Except for the case of a one-dimensional space, these will be singular. But $\sum_i K_i = \mathrm{id}_V$, which has of course full rank.
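In finite terms, this projection-sum picture is a one-liner; a NumPy sketch (the dimension 4 is an arbitrary choice of mine):

```python
import numpy as np

# Rank-1 projectors onto the standard basis vectors of R^4:
# each projector is singular, but their sum is the identity.
projectors = [np.outer(e, e) for e in np.eye(4)]
print(sum(projectors))  # the 4×4 identity, full rank
```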

Brockett's example is actually a variation on this idea: his $K(t)$ is the projection onto a unit vector that rotates about the origin, and in particular $K(0)$ and $K(\tfrac\pi2)$ are the projections onto the Euclidean unit vectors. The integration then interpolates between these projections, but because everything is linear it all boils down to summing them together, up to a scaling by the interval length, which is why you get $\pi\cdot\mathrm{id}$ as the result.