Complete (?) list of row reduction applications

I continue to be amazed that row reduction is still, by far, the single most useful technique I've learned for solving linear algebra problems. Even as I do practice problems for my qualifying exam, my first thought is, "Is there something I can row reduce?" So I thought it might be fun/interesting to come up with something like a "complete" list of applications for row reduction, and I'd like to ask if you have anything to add!

Here's what I've got:

Basic Applications

  1. Find a basis for the row space
  2. Find a basis for the span of a set of vectors
  3. Find a basis for the column space (i.e. range of a transformation)
  4. Find the rank of a matrix
  5. Find a basis for the null space
  6. Solve $Ax = 0$
  7. Solve $Ax = b$
  8. Find the inverse of a matrix
  9. Calculate $\det A$
  10. Check if a set of vectors is linearly independent
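Many of the basic items above fall out of a single row reduction. Here is a minimal sketch (assuming SymPy is available; the matrix is a made-up example with a dependent row) reading off the rank, a row-space basis, a column-space basis, and a null-space basis from one `rref` call:

```python
from sympy import Matrix

# Made-up example: the second row is twice the first, so the rank is 2.
A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

R, pivots = A.rref()                        # reduced row echelon form + pivot columns

rank = len(pivots)                          # item 4: rank = number of pivots
row_basis = [R.row(i) for i in range(rank)]  # item 1: nonzero rows of R
col_basis = [A.col(j) for j in pivots]       # item 3: pivot columns of the ORIGINAL A
null_basis = A.nullspace()                   # item 5: basis for the null space
```

Note the standard subtlety in item 3: the column-space basis is taken from the original matrix's pivot columns, not from the reduced matrix, since row operations change the column space.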

Advanced or Niche Applications

  1. Compute the LU, LDU, or LDL decompositions of a matrix
  2. Per Linear Algebra Done Wrong, diagonalize the matrix of a quadratic form
  3. For symmetric/Hermitian matrices, count the positive, negative, and zero eigenvalues (by Sylvester's law of inertia, these match the signs of the pivots when the elimination uses no row exchanges)
  4. By extension of the previous item, test whether a symmetric/Hermitian matrix is positive definite
  5. Per Hoffman and Kunze, show that the characteristic polynomial of a companion matrix equals its minimal polynomial
  6. Per Hoffman and Kunze, one can use row reduction (and properties of multilinear functions) to show that $\det \begin{bmatrix} A & B \\ 0 & C \end{bmatrix} = \det A \det C$ for the block matrix
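As a quick illustration of items 3–4, here is a minimal numerical sketch (assuming NumPy; the symmetric matrix is a made-up, deliberately indefinite example) checking that the pivot signs from elimination match the eigenvalue signs:

```python
import numpy as np

# Made-up symmetric example, chosen to be indefinite.
S = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1., -1.]])

# Forward elimination without row exchanges, recording each pivot.
M = S.copy()
pivots = []
for k in range(3):
    pivots.append(M[k, k])
    for i in range(k + 1, 3):
        M[i] -= (M[i, k] / M[k, k]) * M[k]

# Sylvester's law of inertia: the signs of the pivots agree with
# the signs of the eigenvalues.
eigs = np.linalg.eigvalsh(S)
```

Here the pivots come out as $2$, $3/2$, $-5/3$, so $S$ has two positive and one negative eigenvalue; in particular it is not positive definite.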

It is applications like the last six that I am looking to add to this list (or basic applications that I have missed). I have done a lot of googling, but unfortunately one finds dozens of sources aimed at very new linear algebra students, and I thought maybe Stack could supply some interesting/surprising/advanced applications I hadn't thought of! Thanks for your input!

Accepted answer:

Another case not on your list:

The computation of a product of the form $CA^{-1}B$, where $C$, $A$, $B$ have dimensions $m \times n$, $n \times n$, and $n \times p$ respectively.

(Such a product arises, in particular, when computing a Schur complement.)

This computation is done by Gaussian elimination on the following matrix:

$$\begin{pmatrix}A & B \\ -C & 0\end{pmatrix} \to \begin{pmatrix}A' & B' \\ 0 & D'\end{pmatrix}$$

using row operations that zero out the first $n$ columns below $A$. These operations add $CA^{-1}$ times the top block of rows to the bottom block, turning $\begin{pmatrix}-C & 0\end{pmatrix}$ into $\begin{pmatrix}0 & CA^{-1}B\end{pmatrix}$, so $D' = CA^{-1}B$.
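Here is a small numerical sketch of the trick (assuming NumPy; the matrices are made-up examples, with $A$ chosen so that elimination needs no row exchanges):

```python
import numpy as np

# Made-up example: A is 3x3 and invertible with nonzero leading pivots,
# B is 3x2, C is 1x3, so C A^{-1} B is 1x2.
A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])
B = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
C = np.array([[1., 2., 3.]])
n, m, p = 3, 1, 2

# Assemble the block matrix [[A, B], [-C, 0]].
M = np.block([[A, B], [-C, np.zeros((m, p))]])

# Forward elimination zeroing the first n columns of every lower row.
for k in range(n):
    for i in range(k + 1, n + m):
        M[i] -= (M[i, k] / M[k, k]) * M[k]

D = M[n:, n:]   # lower-right block: should equal C A^{-1} B
```

The payoff is that no explicit inverse is ever formed; the elimination does the work of $A^{-1}$ implicitly.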

Reference: Lemma 10.6 p. 269 of "Fundamental Problems of Algorithmic Algebra" by Chee Keng Yap, Oxford University Press, 2000.