Decomposition of non-square matrices into invertible square matrices


Let $A \in \mathbb{R}^{n \times m}$, $S \in \mathrm{GL}_{n}(\mathbb{R})$, $T \in \mathrm{GL}_{m}(\mathbb{R})$.

$I(r) \in \mathbb{R}^{n \times m}$ denotes the matrix whose first $r$ diagonal entries are $1$ and whose remaining entries are $0$ (an identity matrix padded with zeros), where $0 \le r \le \min(n,m)$.

Example: $I(2) \in \mathbb{R}^{3 \times 4}$ would be

$\begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{pmatrix}$
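For concreteness, the padded identity $I(r)$ can be built in a few lines of NumPy (the helper name `padded_identity` is mine, not from the problem statement):

```python
import numpy as np

def padded_identity(r, n, m):
    """Return the n x m matrix with ones in the first r diagonal slots, zeros elsewhere."""
    M = np.zeros((n, m))
    M[:r, :r] = np.eye(r)
    return M

print(padded_identity(2, 3, 4))
```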

Show that the following decomposition is always possible, for any $A$:

$A = S \, I(r) \, T$

Is $r$ uniquely determined by $A$?


What I have so far: the dimensions of the product make sense. Multiplying $S$ by $I(r)$ first keeps only the first $r$ columns of $S$ (every column past column $r$ is multiplied by zeros), which gives $A_{ij} = \sum_{k=1}^{r}S_{ik}T_{kj}$. But I don't see why such a decomposition always exists. Thank you very much in advance.

Some possibly related ideas: $A^{t}A$ is square, and this looks like the SVD but with more restrictions on the diagonal. This is from an intro linear algebra and abstract algebra course, primarily using Artin's Algebra textbook.
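As a numerical sanity check of the entrywise formula above (random $S$ and $T$ here, which are invertible with probability 1, so this is only an illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 3, 4, 2

S = rng.standard_normal((n, n))   # almost surely invertible
T = rng.standard_normal((m, m))
Ir = np.zeros((n, m))
Ir[:r, :r] = np.eye(r)            # I(r), the padded identity

A = S @ Ir @ T                    # the claimed decomposition
# Entrywise: A_ij = sum_{k=1}^{r} S_ik * T_kj, i.e. only the
# first r columns of S and first r rows of T contribute
A_entrywise = S[:, :r] @ T[:r, :]

print(np.allclose(A, A_entrywise))  # True
```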



On BEST ANSWER

Suppose we already know that the SVD exists over $\mathbb{R}$, and write $A = U \Sigma V^\top$. Both $U$ and $V$ are orthogonal (hence invertible) of the appropriate sizes, and $\Sigma$ has the structure $$ \Sigma = \begin{pmatrix} \sigma_1 & & & \\ & \ddots & & \\ & & \sigma_r & \\ & & & 0 \end{pmatrix} \in \mathbb{R}^{n \times m}, $$ where $\sigma_1 \geq \ldots \geq \sigma_r > 0$ are the nonzero singular values. This means that we may factor $\Sigma = \widehat{\Sigma} \, I(r)$, where

$$ \widehat{\Sigma} = \mathrm{diag}(\sigma_1, \ldots, \sigma_r, 1, \ldots, 1) \in \mathbb{R}^{n \times n}, $$ where the trailing $1$ entries artificially pad $\widehat{\Sigma}$ to the appropriate size; since every diagonal entry is nonzero, $\widehat{\Sigma}$ is invertible. Now define $S = U \widehat{\Sigma}$ (a product of invertible matrices, hence invertible) and $T = V^\top$, so that $A = U \Sigma V^\top = U \widehat{\Sigma} \, I(r) \, V^\top = S \, I(r) \, T$. As for uniqueness: multiplying by the invertible matrices $S$ and $T$ does not change rank, and $\operatorname{rank} I(r) = r$, so $r = \operatorname{rank} A$ is indeed uniquely determined by $A$.
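The construction in this answer can be checked numerically with NumPy's full SVD; this is a sketch, assuming real matrices and a tolerance-based numerical rank:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 4
A = rng.standard_normal((n, 2)) @ rng.standard_normal((2, m))  # a rank-2 example

U, sigma, Vt = np.linalg.svd(A)       # full SVD: U is n x n, Vt is m x m
r = int(np.sum(sigma > 1e-10))        # numerical rank of A

# Sigma_hat = diag(sigma_1, ..., sigma_r, 1, ..., 1), an invertible n x n matrix
Sigma_hat = np.diag(np.concatenate([sigma[:r], np.ones(n - r)]))

S = U @ Sigma_hat                     # invertible: product of invertible matrices
T = Vt                                # orthogonal, hence invertible
Ir = np.zeros((n, m))
Ir[:r, :r] = np.eye(r)                # I(r)

print(np.allclose(A, S @ Ir @ T))     # True
```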