Is $ \left\{ A P + P A^{T} \mid P \succ 0 \right\} $ open?


Consider an $n$ by $n$ matrix $A$, and the set $$ \left\{ A P + P A^{T} \mid P \succ 0 \right\} $$ Is this set open under the standard metric (the one induced by Frobenius norm) in the space of all symmetric matrices $S^n$?


I think this is true if and only if $A$ is invertible. Roughly speaking, if $A$ is invertible, a perturbation of size $\epsilon$ in the value of $AP+PA^{T}$ can be realized by perturbing $AP$ and $PA^{T}$ by at most $\epsilon$ each. Since $A$ is invertible, $P$ itself then only needs to be perturbed by an amount controlled by $\epsilon$. Since eigenvalues vary continuously, we can take the perturbation small enough that all eigenvalues of $P$ remain positive.

I am not sure, however, whether invertibility of $A$ is actually required for the set to be open, and I'm also having a little trouble making my argument precise. Also, in general, consider

$$\left\{\begin{bmatrix} A_{1}P+PA_{1}^{T} \\ \vdots \\ A_{m}P + P A_{m}^{T} \end{bmatrix} : P \succ 0\right\}$$

under the standard product metric, i.e., the square root of the sum of the squared entrywise differences over all components. When is this set open in the space of $m$-tuples of symmetric matrices?

I think that if each set $\left\{A_{i}P+PA_{i}^{T} : P \succ 0\right\}$ is open, then the set above is open in the product space equipped with this standard product metric. However, I'm not sure about this, nor whether the set can be open when not every $A_{i}$ is invertible.



Suppose $f_A:\mathbb{R}^{n\times n}\rightarrow \mathbb{R}^{n\times n}$ denotes the map $P\mapsto AP+PA^T$. It should be clear that $f_A$ is linear and continuous. Moreover, if $f_A$ is invertible as a linear map, then it is a homeomorphism, since a bijective linear map between finite-dimensional spaces automatically has a continuous inverse. Hence, in that case, $f_A(\mathrm{PD}(n))$ is open, where $\mathrm{PD}(n)$ denotes the open cone of positive definite matrices.

There are two cases. Either $f_A$ has full rank, i.e., the image of $f_A$ is all of $\mathbb{R}^{n\times n}$, in which case $f_A$ is a homeomorphism. Otherwise, $f_A$ does not have full rank, i.e., its image is at most $(n^2-1)$-dimensional. In this case, the image of $f_A$ is contained in a proper linear subspace and thus cannot contain a nonempty open set.

The above discussion reduces the problem to computing the rank of $f_A$.

Claim: The rank of $f_A$ is $n^2$ if and only if $A$ does not have two eigenvalues, $\lambda,\mu$ such that $\lambda+\mu=0$. Equivalently, the Jordan normal form of $A$ does not have two diagonal entries, $\lambda,\mu$ with $\lambda+\mu=0$.

Proof: Suppose $g\in\mathrm{GL}_n$ is an invertible matrix such that $B:=gAg^{-1}$ is the Jordan normal form of $A$.

After vectorization, $f_A = I\otimes A+A\otimes I$ as an $n^2\times n^2$ matrix, where $\otimes$ denotes the Kronecker product.

Now, $(g\otimes g)f_A(g^{-1}\otimes g^{-1})=f_{gAg^{-1}}=f_B$. Since $g\otimes g$ and $g^{-1}\otimes g^{-1}$ are invertible, $f_A$ and $f_{B}$ have the same rank. Therefore, we only need to show that $f_{B}$ has full rank if and only if $B$ does not have two diagonal entries that sum to zero.

Now, the diagonal entries of $f_{B}=I\otimes B+B\otimes I$ are exactly the sums of pairs of diagonal entries of $B$. Note that $f_B$ is an $n\times n$ block matrix whose blocks are $n\times n$. Moreover, since $B$ is upper triangular, so is $f_B$. Thus, we need to show that there are no zeros on the diagonal of $f_B$. The $i$-th diagonal block of $f_B$ is $B_{ii}I+B$, and the diagonal entries of this block are exactly $B_{ii}+B_{jj}$ for $j=1,2,\dots,n$. Hence, $f_B$ is invertible if and only if there are no indices $1\leq i,j\leq n$ with $B_{ii}+B_{jj}=0$. This finishes the proof.
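The rank criterion is easy to probe numerically. A minimal sketch in Python/NumPy, with arbitrarily chosen example matrices (the helper name `lyapunov_matrix` is my own):

```python
import numpy as np

def lyapunov_matrix(A):
    """Matrix of P -> AP + PA^T after vectorization: I (x) A + A (x) I."""
    n = A.shape[0]
    return np.kron(np.eye(n), A) + np.kron(A, np.eye(n))

rng = np.random.default_rng(0)

# Generic A: almost surely no two eigenvalues sum to zero, so f_A has rank n^2.
A = rng.standard_normal((3, 3))
print(np.linalg.matrix_rank(lyapunov_matrix(A)))  # 9

# Eigenvalues 1 and -1 sum to zero: the rank drops. Here it drops to 7,
# since the diagonal of the Kronecker sum picks up exactly two zeros,
# from the pairs (1, -1) and (-1, 1).
B = np.diag([1.0, -1.0, 2.0])
print(np.linalg.matrix_rank(lyapunov_matrix(B)))  # 7
```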

Edit: In the first version of the answer I mistakenly claimed that $f_A$ and $f_{A^{-1}}$ are inverses to each other.


Denote the set of all real symmetric $n\times n$ matrices by $\mathscr S_n$ and its subset of all positive definite matrices by $\mathscr P_n$. Define a linear map $f:M_n(\mathbb R)\to M_n(\mathbb R)$ by $f(X)=AX+XA^T$. Since $f(X)$ is symmetric whenever $X$ is symmetric, $f$ induces a mapping $g:\mathscr S_n\to\mathscr S_n$ defined by $g(S)=f(S)$ for all $S\in\mathscr S_n$. We want to know whether $g(\mathscr P_n)$ is open in $\mathscr S_n$. While this is primarily a question about $g$, as we shall see shortly, the answer is also related to the behaviour of $f$.

The matrix representation of $f$ with respect to the standard basis of $M_n(\mathbb R)$ is $I_n\otimes A+A\otimes I_n$. If the eigenvalues of $A$ (over $\mathbb C$) are $\lambda_1,\lambda_2,\ldots,\lambda_n$, the spectrum of $f$ consists precisely of sums of the form $\lambda_i+\lambda_j$ for all indices $i$ and $j$. So, one may determine the invertibility or non-invertibility of $f$ using the eigenvalues of $A$.
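This spectral fact can be sanity-checked numerically. A sketch with an arbitrary random matrix (not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))       # arbitrary example matrix
n = A.shape[0]

# Matrix representation of f after vectorization.
F = np.kron(np.eye(n), A) + np.kron(A, np.eye(n))

eig_A = np.linalg.eigvals(A)
eig_F = np.linalg.eigvals(F)

# Every pairwise sum lambda_i + lambda_j should appear in the spectrum of F.
pair_sums = np.array([l + m for l in eig_A for m in eig_A])
dist = np.abs(pair_sums[:, None] - eig_F[None, :]).min(axis=1)
print(dist.max() < 1e-8)  # True
```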

Suppose that $f$ is singular. We claim that the equation $AS+SA^T=0$ has a non-trivial symmetric solution $S$. First, as $f$ is singular, $AM+MA^T=0$ for some nonzero matrix $M\in M_n(\mathbb R)$. If $M$ has a nonzero symmetric part $S$, we are done. Otherwise, $M$ is skew-symmetric and the condition $AM+MA^T=0$ implies that $AM$ is symmetric. By a change of orthonormal basis in $\mathbb R^n$, we may assume that $M=\pmatrix{K&0\\ 0&0}$ for some invertible skew-symmetric matrix $K$. Partition $A$ as $\pmatrix{X&Y\\ Z&W}$ accordingly. Since $AM$ is symmetric and $K$ is invertible, we must have $Z=0$ and $XK=S'$ for some symmetric matrix $S'$. Then $S'$ is nonzero whenever $X$ is nonzero. Also, $XS'=S'K^{-1}S'$ is skew-symmetric. Now let $$ S= \begin{cases} \pmatrix{S'&0\\ 0&0}&\text{when $X\ne0$},\\ \pmatrix{I&0\\ 0&0}&\text{when $X=0$}.\\ \end{cases} $$ Then $S=S^T\ne0$ and $AS$ is skew-symmetric. Therefore $S$ is a non-trivial symmetric solution to $AS+SA^T=0$.

So, if $f$ is singular, so is $g$. Hence $g(\mathscr S_n)$ is a proper subspace of $\mathscr S_n$. Therefore $g(\mathscr P_n)$, being a subset of this proper subspace, is not open in $\mathscr S_n$.
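As a concrete illustration (my own numerical sketch, not part of the answer), take $A=\pmatrix{0&1\\ -1&0}$, whose eigenvalues $\pm i$ sum to zero; representing $g$ in a basis of $\mathscr S_2$ shows the rank drop directly:

```python
import numpy as np
from itertools import combinations_with_replacement

def sym_basis(n):
    """Basis of the symmetric n x n matrices (upper-triangular positions)."""
    basis = []
    for i, j in combinations_with_replacement(range(n), 2):
        E = np.zeros((n, n))
        E[i, j] = E[j, i] = 1.0
        basis.append(E)
    return basis

def g_matrix(A):
    """Matrix of g: S -> AS + SA^T restricted to symmetric matrices."""
    n = A.shape[0]
    cols = []
    for E in sym_basis(n):
        GE = A @ E + E @ A.T
        # Coordinates of GE: read off its upper-triangular entries.
        cols.append([GE[i, j] for i, j in combinations_with_replacement(range(n), 2)])
    return np.array(cols).T

# Eigenvalues i and -i sum to zero, so f (hence g) is singular.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
G = g_matrix(A)
print(np.linalg.matrix_rank(G))  # 2 < 3: g has a nontrivial symmetric kernel
```

Here the kernel contains the identity itself: $AI+IA^T=A+A^T=0$ since this $A$ is skew-symmetric.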

Now suppose that $f$ is invertible. Then $g$ is also invertible. For any $S\in g(\mathscr P_n)$, let $P= g^{-1}(S)$. Since $g$ is invertible, we know that $P\succ0$, so $\varepsilon=\frac{\lambda_{\min}(P)}{2\|g^{-1}\|_2}>0$. For any $H\in\mathscr S_n$ with $\|H\|_2<\varepsilon$, we have $\|g^{-1}(H)\|_2\le\|g^{-1}\|_2\|H\|_2<\|g^{-1}\|_2\,\varepsilon=\tfrac12\lambda_{\min}(P)<\lambda_{\min}(P)$. Hence $P+g^{-1}(H)$ is positive definite and $S+H=g(P+g^{-1}(H))\in g(\mathscr P_n)$. That is, the open ball $B(S,\varepsilon)$ (in the topological space $\mathscr S_n$) is a subset of $g(\mathscr P_n)$. Since $S$ was arbitrary, we conclude that $g(\mathscr P_n)$ is open in $\mathscr S_n$.
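The invertible case can also be checked numerically. A sketch (the shift $-3I$, the choice $P=I$, and the perturbation size are my own): shifting a random $A$ pushes all eigenvalues into the left half-plane, so no two of them sum to zero and $f$ is invertible; a small symmetric perturbation of $S=g(P)$ then still has a positive definite preimage.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

# Shifted random A: eigenvalues have negative real part, so all pairwise
# eigenvalue sums are nonzero and f is invertible.
A = rng.standard_normal((n, n)) - 3 * np.eye(n)
F = np.kron(np.eye(n), A) + np.kron(A, np.eye(n))

P = np.eye(n)                  # a positive definite P
S = A @ P + P @ A.T            # S = g(P), a point of the image

H = 1e-3 * rng.standard_normal((n, n))
H = (H + H.T) / 2              # small symmetric perturbation of S

# Preimage of S + H under f; it is symmetric up to round-off because f is
# invertible and S + H is symmetric.
P2 = np.linalg.solve(F, (S + H).flatten()).reshape(n, n)
P2 = (P2 + P2.T) / 2

print(np.min(np.linalg.eigvalsh(P2)) > 0)  # True: S + H is still in the image
```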