How to find the $LU$ factorization of a matrix $A$ when elimination breaks down


Let's say I have the following matrix $A \in \mathbb{R}^{3\times3}$

$$A = \begin{bmatrix} x & x & x \\ x & x & x \\ x & x & x \\ \end{bmatrix}$$

How can I find the $LU$ factorization of this matrix $A$ if elimination breaks down, given that each row is a scalar multiple of every other row?

The method I've used in the past to factorize $A$ into $LU$ has been:

  1. Reduce $A$ to $U$ by multiplying $A$ by elimination matrices $E_{ij}E_{kl}\cdots E_{mn}$.
  2. Find $L$ as the inverse of the product of those elimination matrices, i.e. $L = E_{mn}^{-1}\cdots E_{kl}^{-1}E_{ij}^{-1}$.

However, this method only works if elimination doesn't break down (i.e. there are no zero entries in pivot positions).

Is there a more general way to factorize matrices that allows elimination to break down and still find the $LU$ factorization?

There are 3 answers below.

BEST ANSWER

So $\begin{bmatrix} x & x & x \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$ is what you get after you try to clear the first column below its pivot. This is already an upper triangular matrix which is row-equivalent to $A$, even though you "expected" to need to do more row operations to clear out the second column. So this matrix can serve as your $U$.

What did we do to make this happen? We subtracted row $1$ from row $2$, and we subtracted row $1$ from row $3$. Denote by $E_{ij}(y)$ the matrix with $1$s on the diagonal, $y$ in the $(i,j)$ position, and zeros elsewhere. (This is not standard notation; I just made it up.) Then $U=E_{31}(-1) E_{21}(-1) A$. Since $E_{31}(-1)$ and $E_{21}(-1)$ are invertible, you can invert them to get $L = E_{21}(1)E_{31}(1)$. You'll find $L=\begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix}$.
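To see this concretely, here is a short NumPy sketch (taking $x=1$ purely for illustration; the value of $x$ doesn't matter) that builds these elimination matrices and checks that $LU=A$:

```python
import numpy as np

x = 1.0  # illustrative value; any scalar works
A = x * np.ones((3, 3))

def E(i, j, y, n=3):
    """Elementary matrix: identity with y in position (i, j) (1-indexed)."""
    M = np.eye(n)
    M[i - 1, j - 1] = y
    return M

# Subtract row 1 from rows 2 and 3: U = E31(-1) E21(-1) A
U = E(3, 1, -1.0) @ E(2, 1, -1.0) @ A

# L is the product of the inverses in the reverse order: E21(1) E31(1)
L = E(2, 1, 1.0) @ E(3, 1, 1.0)

print(U)                      # upper triangular: only the first row survives
print(L)                      # unit lower triangular
print(np.allclose(L @ U, A))  # True
```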

So in this example elimination actually didn't break down, you just didn't need to use a second pivot to clear out the second column. By contrast, there are other cases where elimination does break down, in the sense that you have a zero pivot with nonzero entries below it. For example, $A=\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 2 \\ 3 & 1 & 1 \end{bmatrix}$. With this matrix, removing the $1$ in the $(2,1)$ position also removes the $1$ in the $(2,2)$ position, so the second column does not have a pivot in the correct position.

In this case a row exchange is required. A factorization which tracks this row exchange is commonly referred to as a $PA=LU$ decomposition, where $P$ is a permutation matrix. In Matlab we can write

`[L,U,P] = lu(A)`.

Another way to write it would be $A=(P^T L)U$; Matlab calls this aggregate $P^T L$ a "psychologically lower triangular matrix". This is what you get in Matlab if you make the obvious call to lu, i.e.

`[L,U] = lu(A)`.
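For readers without Matlab, here is a sketch of the same $PA=LU$ idea in NumPy, redoing the elimination by hand on the example matrix above (the manual loop is mine, not from the answer). The row exchange swaps rows $2$ and $3$ once the zero pivot appears:

```python
import numpy as np

A = np.array([[1., 1., 1.],
              [1., 1., 2.],
              [3., 1., 1.]])

# After clearing column 1, the (2,2) pivot is zero with a nonzero entry
# below it, so exchange rows 2 and 3 up front and eliminate on PA instead.
P = np.array([[1., 0., 0.],
              [0., 0., 1.],
              [0., 1., 0.]])
PA = P @ A

L = np.eye(3)
U = PA.copy()
for k in range(2):            # forward elimination, recording the multipliers
    for i in range(k + 1, 3):
        L[i, k] = U[i, k] / U[k, k]
        U[i] -= L[i, k] * U[k]

print(np.allclose(P @ A, L @ U))  # True
```

SciPy's `scipy.linalg.lu` computes the same kind of factorization, though note that its default convention is $A = PLU$ rather than $PA = LU$.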
ANOTHER ANSWER

One obvious LU decomposition would be $$ \begin{pmatrix} x&x&x \\ x&x&x \\ x&x&x \end{pmatrix} = \begin{pmatrix} x&& \\ x&0& \\ x&0&0 \end{pmatrix} \begin{pmatrix} 1&1&1 \\ &0&0 \\ &&0 \end{pmatrix} $$
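One can check this factorization numerically; a minimal sketch, taking $x=2$ as an arbitrary nonzero value:

```python
import numpy as np

x = 2.0  # arbitrary nonzero value for illustration
L = np.array([[x, 0., 0.],
              [x, 0., 0.],
              [x, 0., 0.]])
U = np.array([[1., 1., 1.],
              [0., 0., 0.],
              [0., 0., 0.]])

A = L @ U
print(np.allclose(A, x * np.ones((3, 3))))  # True
```

The trade-off is that this $L$ is singular (zero diagonal entries), so it is not the unit lower triangular factor that the standard elimination algorithm produces.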

ANOTHER ANSWER

The matrix having at least one singular value equal to $0$ (or too close to $0$ numerically), i.e. being singular, is a prerequisite for elimination breaking down even when row exchanges are allowed. One could compute an SVD and then do operations only on the rows/columns corresponding to nonzero singular values. One could interpret Henning's factorization as such an SVD with an invisible $\left[\begin{array}{ccc}1&0&0\\0&0&0\\0&0&0\end{array}\right]$ matrix in between; moving the $x$ into the singular-value position gives: $$\left[\begin{array}{ccc}1&0&0\\1&0&0\\1&0&0\end{array}\right]\left[\begin{array}{ccc}x&0&0\\0&0&0\\0&0&0\end{array}\right]\left[\begin{array}{ccc}1&1&1\\0&0&0\\0&0&0\end{array}\right]$$

Not exactly normalized yet, but you get the idea.
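A quick numerical check of this three-factor product, plus the fully normalized version for comparison (assuming $x>0$): the all-$x$ matrix has the single nonzero singular value $3x$, with unit singular vectors $(1,1,1)^T/\sqrt{3}$.

```python
import numpy as np

x = 2.0  # arbitrary positive value for illustration

# The three factors from the answer above
F1 = np.array([[1., 0., 0.], [1., 0., 0.], [1., 0., 0.]])
S  = np.array([[x,  0., 0.], [0., 0., 0.], [0., 0., 0.]])
F2 = np.array([[1., 1., 1.], [0., 0., 0.], [0., 0., 0.]])

A = x * np.ones((3, 3))
print(np.allclose(F1 @ S @ F2, A))  # True

# Normalized: unit singular vector u = (1,1,1)/sqrt(3), singular value 3x
u = np.ones((3, 1)) / np.sqrt(3)
sigma = 3 * x
print(np.allclose(A, sigma * u @ u.T))             # True
print(np.allclose(np.linalg.svd(A)[1][0], sigma))  # largest singular value is 3x
```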