How can I express a linear matrix inequality in an expanded form?


In the paper Kalman filtering with intermittent observations by Sinopoli et al., I found the following linear matrix inequality (LMI) $$ \begin{bmatrix}X - (1-\lambda)AXA^T & \sqrt{\lambda}F \\ \sqrt{\lambda}F^T & X^{-1}\end{bmatrix} > 0 \label{eq1} \tag{1} $$ where $X - (1-\lambda)AXA^T > 0$, $F < 0$, and $\lambda \in [0,1]$. The author of the paper then writes:

Using one more time the Schur complement decomposition on the first element of the matrix we obtain $$ \begin{bmatrix}X & \sqrt{\lambda}F & \sqrt{1-\lambda}A \\ \sqrt{\lambda}F^T & X^{-1} & 0 \\ \sqrt{1-\lambda}A^T & 0 & X^{-1}\end{bmatrix} > 0 \label{eq2} \tag{2} $$

However, I'm not sure how the author went from the LMI in \eqref{eq1} to the LMI in \eqref{eq2}. For reference, the "Schur complement decomposition" referred to by the author is the following theorem:

For any symmetric matrix $M$ of the form $$ M = \begin{bmatrix}A & B \\ B^T & C\end{bmatrix} $$ if $C$ is invertible, then $M > 0$ if and only if $C > 0$ and $A - BC^{-1}B^T > 0$.
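This theorem is easy to sanity-check numerically. The sketch below (the sizes and the random positive definite construction are my own illustration, not from the paper) builds a symmetric positive definite $M$, partitions it, and confirms that the two characterizations agree:

```python
import numpy as np

# Sanity check of the Schur complement theorem: for symmetric
# M = [[A, B], [B^T, C]] with C invertible,
# M > 0  iff  C > 0 and A - B C^{-1} B^T > 0.
rng = np.random.default_rng(0)
n = 3

G = rng.standard_normal((2 * n, 2 * n))
M = G @ G.T + 2 * n * np.eye(2 * n)   # symmetric positive definite by construction

A, B, C = M[:n, :n], M[:n, n:], M[n:, n:]

def is_pd(S):
    """True iff the symmetric matrix S is positive definite."""
    return bool(np.all(np.linalg.eigvalsh(S) > 0))

schur = A - B @ np.linalg.inv(C) @ B.T
# Both characterizations agree on this example.
assert is_pd(M) == (is_pd(C) and is_pd(schur))
```

Repeating the check with an indefinite $M$ (e.g. subtracting a large multiple of the identity) exercises the "only if" direction as well.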


Update

Based on @Park's answer, since the matrix in \eqref{eq1} and $X^{-1}$ are both positive definite, the matrix $$\begin{pmatrix} X-(1-\lambda)AXA^{T} & \sqrt{\lambda}F & 0 \\ \sqrt{\lambda}F^{T} & X^{-1} & 0 \\ 0 & 0 & X^{-1} \\ \end{pmatrix}$$ is also positive definite: it is block-diagonal, and both the upper-left block (the matrix in \eqref{eq1}) and the $X^{-1}$ in the bottom-right block have positive eigenvalues. However, it seems that @Park's answer relies on a congruence transformation $$Q\begin{pmatrix} X-(1-\lambda)AXA^{T} & \sqrt{\lambda}F & 0 \\ \sqrt{\lambda}F^{T} & X^{-1} & 0 \\ 0 & 0 & X^{-1} \\ \end{pmatrix}Q^T = \begin{pmatrix}X & \sqrt{\lambda}F & \sqrt{1-\lambda}A \\ \sqrt{\lambda}F^T & X^{-1} & 0 \\ \sqrt{1-\lambda}A^T & 0 & X^{-1}\end{pmatrix}$$ to show that $$\begin{pmatrix}X & \sqrt{\lambda}F & \sqrt{1-\lambda}A \\ \sqrt{\lambda}F^T & X^{-1} & 0 \\ \sqrt{1-\lambda}A^T & 0 & X^{-1}\end{pmatrix} > 0.$$ I'm not sure what the block matrix $Q$ should be, so I would appreciate some direction.


There are 3 answers below.

Best answer

(1) is written as $$\begin{bmatrix}X & \sqrt{\lambda}F \\ \sqrt{\lambda}F^T & X^{-1}\end{bmatrix} - \begin{bmatrix} (1-\lambda)AXA^T & 0 \\ 0 & 0\end{bmatrix} > 0$$ or $$\begin{bmatrix}X & \sqrt{\lambda}F \\ \sqrt{\lambda}F^T & X^{-1}\end{bmatrix} - \begin{bmatrix} \sqrt{1-\lambda}A \\ 0 \end{bmatrix}X[\sqrt{1-\lambda}\,A^T \quad 0] > 0\tag{3}$$ which, using the Schur complement, is equivalent to $$\begin{bmatrix}X & \sqrt{\lambda}F & \sqrt{1-\lambda}A \\ \sqrt{\lambda}F^T & X^{-1} & 0 \\ \sqrt{1-\lambda}A^T & 0 & X^{-1}\end{bmatrix} > 0.$$ (Note: to see this, apply the Schur complement theorem to (3) with the theorem's blocks taken as $A = \mbox{first term of the LHS}$, $B^T = [\sqrt{1-\lambda}\,A^T \quad 0]$, and $C = X^{-1}$; these $A$, $B$, $C$ are the blocks in the theorem statement, not the matrices of the problem.)
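The step from (3) to the $3\times 3$ block LMI is an algebraic identity, so it can be checked on arbitrary data: the Schur complement of the bottom-right $X^{-1}$ block in the big matrix recovers the left-hand side of (3), i.e. the matrix in (1). A NumPy sketch (the sizes, $\lambda$, and the random $X$, $A$, $F$ are my own arbitrary choices):

```python
import numpy as np

# Check: the Schur complement of the bottom-right X^{-1} block of the
# 3x3 block matrix equals the matrix in LMI (1).
rng = np.random.default_rng(1)
n = 3
lam = 0.4

G = rng.standard_normal((n, n))
X = G @ G.T + n * np.eye(n)          # symmetric positive definite X
A = rng.standard_normal((n, n))
F = rng.standard_normal((n, n))
Xinv = np.linalg.inv(X)
Z = np.zeros((n, n))

big = np.block([
    [X,                       np.sqrt(lam) * F, np.sqrt(1 - lam) * A],
    [np.sqrt(lam) * F.T,      Xinv,             Z],
    [np.sqrt(1 - lam) * A.T,  Z,                Xinv],
])

Top = big[:2 * n, :2 * n]            # the 2x2 block of blocks (theorem's A)
B = big[:2 * n, 2 * n:]              # [sqrt(1-lam) A; 0]   (theorem's B)
C = big[2 * n:, 2 * n:]              # X^{-1}               (theorem's C)

schur = Top - B @ np.linalg.inv(C) @ B.T
lmi1 = np.block([
    [X - (1 - lam) * A @ X @ A.T, np.sqrt(lam) * F],
    [np.sqrt(lam) * F.T,          Xinv],
])
assert np.allclose(schur, lmi1)      # identity holds regardless of definiteness
```

Note that the identity holds whether or not the matrices are positive definite; definiteness only enters when the Schur complement theorem converts the identity into an equivalence of LMIs.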

Answer

$$\begin{pmatrix} X-(1-\lambda)AXA^{T} & \sqrt{\lambda}F \\ \sqrt{\lambda}F^{T} & X^{-1} \\ \end{pmatrix}>0\Leftrightarrow \begin{pmatrix} X-(1-\lambda)AXA^{T} & \sqrt{\lambda}F & 0 \\ \sqrt{\lambda}F^{T} & X^{-1} & 0 \\ 0 & 0 & X^{-1} \\ \end{pmatrix}>0\Leftrightarrow\begin{pmatrix} X & \sqrt{\lambda}F & \sqrt{1-\lambda}A \\ \sqrt{\lambda}F^{T} & X^{-1} & 0 \\ \sqrt{1-\lambda}A^{T} & 0 & X^{-1} \\ \end{pmatrix}>0$$
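The first equivalence rests on a simple fact: appending a positive definite diagonal block leaves positive definiteness unchanged, because the spectrum of a block-diagonal matrix is the union of the spectra of its blocks. A NumPy sketch (the sizes and the generic positive definite stand-in $W$ for the $2\times 2$ block matrix are my own assumptions):

```python
import numpy as np

# diag(W, Xinv) > 0  iff  W > 0 and Xinv > 0: the eigenvalues of a
# block-diagonal matrix are the union of the eigenvalues of its blocks.
rng = np.random.default_rng(2)
n = 2

G = rng.standard_normal((2 * n, 2 * n))
W = G @ G.T + 2 * n * np.eye(2 * n)            # positive definite by construction
H = rng.standard_normal((n, n))
Xinv = np.linalg.inv(H @ H.T + n * np.eye(n))  # inverse of a PD matrix is PD

D = np.block([
    [W,                    np.zeros((2 * n, n))],
    [np.zeros((n, 2 * n)), Xinv],
])

eigs_D = np.linalg.eigvalsh(D)
eigs_blocks = np.concatenate([np.linalg.eigvalsh(W), np.linalg.eigvalsh(Xinv)])
assert np.allclose(np.sort(eigs_D), np.sort(eigs_blocks))  # union of spectra
assert np.all(eigs_D > 0)                                  # hence D > 0
```

The second equivalence is the congruence transformation discussed in the next answer.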

Answer

This is an expanded version of @Park's answer with some commentary.

To go from the block matrix $$W = \begin{bmatrix}X - (1-\lambda)AXA^T & \sqrt{\lambda}F \\ \sqrt{\lambda}F^T & X^{-1}\end{bmatrix}$$ to the block matrix $$V = \begin{bmatrix}X & \sqrt{\lambda}F & \sqrt{1-\lambda}A \\ \sqrt{\lambda}F^T & X^{-1} & 0 \\ \sqrt{1-\lambda}A^T & 0 & X^{-1}\end{bmatrix}$$ we proceed in reverse. That is, we start from $V$ and simplify to $W$. To do so, we first go from $V$ to the following block matrix $Y$ via column and row reduction: $$ Y = \begin{bmatrix}X - (1-\lambda)AXA^T & \sqrt{\lambda}F & 0 \\ \sqrt{\lambda}F^T & X^{-1} & 0 \\ 0 & 0 & X^{-1}\end{bmatrix} $$ First, we eliminate the block $\sqrt{1-\lambda}A^T$ in the bottom-left of $V$ via column reduction: $$ \begin{bmatrix}X & \sqrt{\lambda}F & \sqrt{1-\lambda}A \\ \sqrt{\lambda}F^T & X^{-1} & 0 \\ \sqrt{1-\lambda}A^T & 0 & X^{-1}\end{bmatrix}\begin{bmatrix}I & 0 & 0 \\ 0 & I & 0 \\ -\sqrt{1-\lambda}XA^T & 0 & I\end{bmatrix} = \begin{bmatrix}X - (1-\lambda)AXA^T & \sqrt{\lambda}F & \sqrt{1-\lambda}A \\ \sqrt{\lambda}F^T & X^{-1} & 0 \\ 0 & 0 & X^{-1}\end{bmatrix} $$ We then eliminate the block $\sqrt{1-\lambda}A$ in the top-right position of the matrix above using row reduction: $$ \begin{align} \begin{bmatrix}I & 0 & -\sqrt{1-\lambda}AX \\ 0 & I & 0 \\ 0 & 0 & I\end{bmatrix} \begin{bmatrix}X - (1-\lambda)AXA^T & \sqrt{\lambda}F & \sqrt{1-\lambda}A \\ \sqrt{\lambda}F^T & X^{-1} & 0 \\ 0 & 0 & X^{-1}\end{bmatrix} &= \begin{bmatrix}X - (1-\lambda)AXA^T & \sqrt{\lambda}F & 0 \\ \sqrt{\lambda}F^T & X^{-1} & 0 \\ 0 & 0 & X^{-1}\end{bmatrix} \\ &= Y \end{align} $$ Combining the two steps, $$ \begin{align} \begin{bmatrix}I & 0 & -\sqrt{1-\lambda}AX \\ 0 & I & 0 \\ 0 & 0 & I\end{bmatrix}\begin{bmatrix}X & \sqrt{\lambda}F & \sqrt{1-\lambda}A \\ \sqrt{\lambda}F^T & X^{-1} & 0 \\ \sqrt{1-\lambda}A^T & 0 & X^{-1}\end{bmatrix}\begin{bmatrix}I & 0 & 0 \\ 0 & I & 0 \\ -\sqrt{1-\lambda}XA^T & 0 & I\end{bmatrix} &= \begin{bmatrix}X - (1-\lambda)AXA^T & \sqrt{\lambda}F & 0 \\ \sqrt{\lambda}F^T & X^{-1} & 0 \\ 0 & 0 & X^{-1}\end{bmatrix} \\ QVQ^T &= Y \end{align} $$ where $$ Q = \begin{bmatrix}I & 0 & -\sqrt{1-\lambda}AX \\ 0 & I & 0 \\ 0 & 0 & I\end{bmatrix} $$ Here the column-reduction matrix is exactly $Q^T$, since $(-\sqrt{1-\lambda}AX)^T = -\sqrt{1-\lambda}XA^T$ by symmetry of $X$. Because $V$ is congruent to $Y$ via the invertible matrix $Q$, the two matrices share the same definiteness, so $V > 0 \iff Y > 0$.

Finally, because $Y$ is a block-diagonal matrix, the linear matrix inequality \begin{align} Y &> 0 \\ \begin{bmatrix}X - (1-\lambda)AXA^T & \sqrt{\lambda}F & 0 \\ \sqrt{\lambda}F^T & X^{-1} & 0 \\ 0 & 0 & X^{-1}\end{bmatrix} &> 0 \end{align} decomposes into the two separate linear matrix inequalities \begin{align} \begin{bmatrix}X - (1-\lambda)AXA^T & \sqrt{\lambda}F \\ \sqrt{\lambda}F^T & X^{-1}\end{bmatrix} &> 0 \\ X^{-1} &> 0 \end{align} which shows that $W > 0$ if $V > 0$. For the converse, we run through the same steps in reverse; every step above is an equivalence.
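The congruence $QVQ^T = Y$ is itself an algebraic identity that needs only symmetry and invertibility of $X$, so it can be verified on arbitrary data. A NumPy sketch (the sizes, $\lambda$, and the random $X$, $A$, $F$ are my own illustrative choices):

```python
import numpy as np

# Numerical check of the identity Q V Q^T = Y with the Q derived above.
rng = np.random.default_rng(3)
n = 3
lam = 0.3
s = np.sqrt(1 - lam)

G = rng.standard_normal((n, n))
X = G @ G.T + n * np.eye(n)          # symmetric positive definite
A = rng.standard_normal((n, n))
F = rng.standard_normal((n, n))
Xinv = np.linalg.inv(X)
I, Z = np.eye(n), np.zeros((n, n))

V = np.block([
    [X,                  np.sqrt(lam) * F, s * A],
    [np.sqrt(lam) * F.T, Xinv,             Z],
    [s * A.T,            Z,                Xinv],
])
Q = np.block([
    [I, Z, -s * A @ X],
    [Z, I, Z],
    [Z, Z, I],
])
Y = np.block([
    [X - (1 - lam) * A @ X @ A.T, np.sqrt(lam) * F, Z],
    [np.sqrt(lam) * F.T,          Xinv,             Z],
    [Z,                           Z,                Xinv],
])
assert np.allclose(Q @ V @ Q.T, Y)   # congruence identity holds
```

Since $Q$ is unit upper block-triangular, it is invertible, so the congruence preserves definiteness as claimed.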


More generally, we have learned from the derivation above that, given the set of LMIs \begin{align} A_1 &> 0 \\ A_2 &> 0 \\ \vdots \\ A_N &> 0 \end{align} we can arrange them into a block-diagonal matrix that satisfies the same LMI: \begin{align} \begin{bmatrix}A_1 & 0 & \cdots & 0 \\ 0 & A_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & A_N \end{bmatrix} > 0 \end{align} Then, because congruent matrices have the same definiteness, for any invertible block matrix $Q$ we also have \begin{align} Q\begin{bmatrix}A_1 & 0 & \cdots & 0 \\ 0 & A_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & A_N \end{bmatrix}Q^T > 0 \end{align}
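This general principle can also be checked numerically: stack positive definite blocks $A_1, \dots, A_N$ into a block-diagonal $D$, then apply a congruence with any invertible $Q$. A NumPy sketch (the block sizes and the particular invertible $Q$ are illustrative assumptions):

```python
import numpy as np

# Stack PD blocks A_1..A_N into block-diagonal D; a congruence Q D Q^T
# with any invertible Q preserves positive definiteness.
rng = np.random.default_rng(4)
n, N = 2, 3

blocks = []
for _ in range(N):
    G = rng.standard_normal((n, n))
    blocks.append(G @ G.T + n * np.eye(n))     # each A_i > 0 by construction

D = np.zeros((N * n, N * n))
for i, Ai in enumerate(blocks):
    D[i * n:(i + 1) * n, i * n:(i + 1) * n] = Ai

# Unit upper-triangular Q has determinant 1, hence is invertible.
Q = np.eye(N * n) + np.triu(rng.standard_normal((N * n, N * n)), k=1)

assert np.all(np.linalg.eigvalsh(D) > 0)             # D > 0
assert np.all(np.linalg.eigvalsh(Q @ D @ Q.T) > 0)   # Q D Q^T > 0
```

Choosing a structured $Q$, as in the answer above, is what turns the trivial block-diagonal LMI into the non-trivial one in \eqref{eq2}.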