Is the set of matrices with rank at most $2$ closed?


Let $$S := \left\{ {\bf X} \in \Bbb R^{4 \times 4} \mid \operatorname{rank} ({\bf X}) \leq 2 \right\}$$ Is $S$ closed with respect to the usual Euclidean metric?


I think so. My reasoning is the following:

Let $({\bf A}_n)$ be a sequence in $S$ converging to a matrix $\bf A$. We observe that the row operations performed in order to obtain row echelon form are continuous. Let ${\bf A}'_n$ denote the row echelon form of ${\bf A}_n$. Since the rank is preserved, there are at most two nonzero rows in ${\bf A}'_n$. Write ${\bf A}'_n = \phi({\bf A}_n)$ for some continuous function $\phi$. Since ${\bf A}_n$ converges, ${\bf A}'_n$ must converge as well. Since ${\bf A}'_n$ converges, its corresponding entries also converge, so the limiting matrix has at most $2$ nonzero rows, or equivalently, rank at most $2$. This limit must be the row echelon form of ${\bf A}$, so ${\bf A}$ is in $S$.

Is there any problem with the above argument?


BEST ANSWER

I do not think that your proof is correct. You claim that there is a "universal sequence of row operations" transforming all matrices into echelon form. But it seems to me that we need different sequences of row operations for different matrices. Only if you could prove that such a universal sequence actually exists would you get a continuous function $\phi$ assigning to each matrix $A$ a matrix $\phi(A)$ in echelon form. I doubt that it does, but maybe I am wrong.

So what can be done? I think Didier's comment is an essential ingredient. We know that a $4 \times 4$ matrix $A$ has rank $\le 2$ iff every $3 \times 3$ submatrix $A^{i,j}$, obtained from $A$ by deleting row $i$ and column $j$, has vanishing determinant. Now consider a sequence $(A_n)$ of matrices with rank $\le 2$ which converges to a matrix $A$. Consider all sequences $(A^{i,j}_n)$ of $3 \times 3$ submatrices. They converge to the submatrices $A^{i,j}$ of $A$. Since the determinant is continuous, all $A^{i,j}$ have vanishing determinant, which means that $A$ has rank $\le 2$.
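The minor criterion can be checked numerically. Below is a small sketch (assuming NumPy; the vectors $u, v$ and the particular sequence are illustrative choices): each ${\bf A}_n = uu^T + vv^T + uv^T/n = u(u + v/n)^T + vv^T$ has rank at most $2$ and converges to $A = uu^T + vv^T$, and every $3 \times 3$ minor stays at $0$ along the way and in the limit.

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0, 4.0])
v = np.array([0.0, 1.0, 0.0, 1.0])
A = np.outer(u, u) + np.outer(v, v)          # rank-2 limit matrix

def max_minor_det(M):
    """Largest |det| over all 3x3 submatrices of a 4x4 matrix M,
    obtained by deleting one row i and one column j."""
    n = M.shape[0]
    dets = []
    for i in range(n):
        for j in range(n):
            sub = np.delete(np.delete(M, i, axis=0), j, axis=1)
            dets.append(abs(np.linalg.det(sub)))
    return max(dets)

for n in [1, 10, 100]:
    A_n = A + np.outer(u, v) / n             # rank <= 2, converges to A
    print(max_minor_det(A_n))                # numerically zero for every n

print(max_minor_det(A))                      # zero in the limit as well
```

Since each minor of ${\bf A}_n$ vanishes and the determinant is continuous, the corresponding minor of $A$ must vanish too, which is exactly the argument above.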

ANSWER

I'm going to address only your posted question, regarding whether there is a problem with your argument.

Yes, there is a big problem.

It is true that any individual row operation is a continuous operation on the space of $4 \times 4$ matrices, i.e. a continuous function $\mathbb R^{16} \to \mathbb R^{16}$. This is true for more-or-less the reason you explained in the comments. The way I would say it is that an individual row operation is just left-multiplication by an individual elementary matrix $E$: $EA$ is the result of applying the given row operation to $A$, and when $E$ is fixed, $EA$ is a continuous function of $A$.

And it is still true that if one fixes one particular sequence of row operations, then that fixed sequence represents a continuous function on the space of $4 \times 4$ matrices. By fixing the sequence of row operations, one is fixing a sequence of elementary matrices $E_1,...,E_k$ to multiply on the left by: $E_1 A$ is the result of applying the first row operation, $E_2 E_1 A$ is the result of applying the second row operation, and so on. Taking the product of that sequence of elementary matrices, $M = E_k \cdots E_1$, one gets a fixed invertible matrix to multiply on the left by: $MA$ is the row echelon form of $A$, and $MA$ is a continuous function of $A$.
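This point can be made concrete in a few lines (a sketch assuming NumPy; the particular row operations chosen are arbitrary): a fixed sequence of elementary matrices collapses to one fixed invertible matrix $M$, and $A \mapsto MA$ is linear, hence continuous.

```python
import numpy as np

# A fixed sequence of row operations, each given by an elementary matrix:
E1 = np.eye(4); E1[[0, 1]] = E1[[1, 0]]   # swap rows 0 and 1
E2 = np.eye(4); E2[2, 0] = -3.0           # add -3 * (row 0) to row 2
E3 = np.eye(4); E3[1, 1] = 0.5            # scale row 1 by 1/2

M = E3 @ E2 @ E1                          # one fixed invertible matrix

A = np.arange(16.0).reshape(4, 4)         # an arbitrary test matrix
step_by_step = E3 @ (E2 @ (E1 @ A))       # apply the operations in order
assert np.allclose(step_by_step, M @ A)   # same as multiplying by M once
```

Because $M$ is fixed, $\|MA - MB\| \le \|M\|\,\|A - B\|$, so this map is even Lipschitz continuous.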

One way this is summarized in linear algebra is the "$LU$-factorization theorem": for every matrix $A$ (for which row reduction needs no row swaps) there is a factorization $A = LU$ where $U$ is the echelon form of $A$, $L$ is a square invertible matrix, and $M = L^{-1}$ is the product of the elementary matrices used in the row reduction process.
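A hedged numerical sketch of that statement (assuming NumPy; the $3 \times 3$ matrix is an arbitrary example whose leading pivots happen to be nonzero, so no row swaps are needed): accumulate the elementary matrices into $M$, set $U = MA$ and $L = M^{-1}$, and recover $A = LU$.

```python
import numpy as np

A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])

M = np.eye(3)                            # will become E_k ... E_1
U = A.copy()
for j in range(2):                       # clear entries below each pivot
    for i in range(j + 1, 3):
        E = np.eye(3)
        E[i, j] = -U[i, j] / U[j, j]     # one elementary row operation
        U = E @ U
        M = E @ M                        # accumulate the fixed product

L = np.linalg.inv(M)                     # M = L^{-1}
assert np.allclose(A, L @ U)             # A = L U
assert np.allclose(U, np.triu(U))        # U is upper triangular (echelon)
```

In general, partial pivoting forces an extra permutation matrix ($PA = LU$), which is precisely the kind of matrix-dependent choice discussed below.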

But here's the big problem. Suppose that I take two different matrices ${\bf A}_m$ and ${\bf A}_n$ in your sequence. There is no reason at all to expect their row operation sequences to be the same. In fact, it is certainly possible that every single matrix has a row operation sequence different from the others. One can express this in the language of the $LU$ factorization theorem: we do get factorizations ${\bf A}_n = L_n U_n$, and so we do get factorizations $L_n^{-1} {\bf A}_n = U_n$ where $M_n = L_n^{-1}$ is the product of the elementary matrices used to get ${\bf A}_n$ into row echelon form. But it is possible that every single $M_n$ matrix is different from the others.

Now, IF we knew that all of the $M_n$'s were the same then, perhaps, your argument could be made to work.

But that's a big IF. More likely, the $M_n$'s are not all the same, and your argument fails.
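A tiny $2 \times 2$ sketch (illustrative, assuming NumPy) shows why no continuous $\phi$ sending a matrix to its echelon form can exist: if the echelon form normalizes the leading pivot to $1$, then ${\bf A}_\varepsilon = \begin{pmatrix} \varepsilon & 1 \\ 0 & 0 \end{pmatrix}$ has echelon form $\begin{pmatrix} 1 & 1/\varepsilon \\ 0 & 0 \end{pmatrix}$, which blows up as $\varepsilon \to 0$, while the limit matrix $\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ is already in echelon form.

```python
import numpy as np

def echelon_2x2(A):
    """Row echelon form of a 2x2 matrix with the pivot normalized to 1.
    A sketch, not a general routine: only the case A[0,0] != 0 reduces."""
    A = A.astype(float).copy()
    if A[0, 0] != 0.0:
        A[0] /= A[0, 0]              # normalize the leading pivot to 1
        A[1] -= A[1, 0] * A[0]       # clear the entry below the pivot
    return A

for eps in [1.0, 0.1, 0.001]:
    A_eps = np.array([[eps, 1.0], [0.0, 0.0]])
    print(echelon_2x2(A_eps)[0, 1])  # equals 1/eps: unbounded as eps -> 0
```

So the echelon forms $\phi({\bf A}_\varepsilon)$ do not converge even though ${\bf A}_\varepsilon$ does, which is exactly where the proposed argument breaks down.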