Interesting/Useful tricks in linear algebra.

This may be an opinion-based question, since everyone understands things differently, but what are some interesting/useful tricks in linear algebra?

For example, if we know that the determinant of a matrix is equal to $0$, then we know that the matrix is not invertible, the rows and columns of the matrix are linearly dependent, and so on.

Are there any others, that would be useful to have on a cheatsheet for a linear algebra exam?

There are 2 best solutions below

---

My top 3 (which many of my students keep forgetting/not understanding):

  1. You can perform elementary row operations by multiplying a given matrix on the left by the appropriate regular (invertible) matrix.
  2. Linear transformations and matrices are "the same thing" and questions can be translated between the two relatively easily.
  3. You can define a linear transformation by defining it on a basis, no need to come up with a general formula.
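A minimal NumPy sketch of point 1 (the matrices here are my own made-up example): building an elementary matrix from the identity and multiplying on the left performs the row operation.

```python
import numpy as np

# Encode the row operation "add 3 * (row 0) to row 2" in an elementary matrix.
A = np.array([[1., 2.],
              [3., 4.],
              [5., 6.]])
E = np.eye(3)
E[2, 0] = 3.0           # elementary matrix for this row operation

left = E @ A            # multiplying on the left acts on the rows

# The same operation performed directly on the rows:
direct = A.copy()
direct[2] += 3 * direct[0]

print(np.allclose(left, direct))  # True
```

Since `E` is invertible, the row operation is reversible: multiplying by `np.linalg.inv(E)` undoes it.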
---
  1. Rank factorization: any matrix $\textbf{A}_{m\times n}$ with $\rho(\textbf{A})=r$ can be written as $\textbf{A}=\textbf{B}\textbf{C}$, where $\textbf{B}$ is an $m\times r$ and $\textbf{C}$ is an $r\times n$ matrix. Used properly, this is a very useful property of a matrix. For a positive definite matrix you can write $\textbf{A}_{n\times n}=\textbf{B}^{T}\textbf{B}$.

  2. Block matrices: partition $\textbf{A}= \left[\begin{matrix} \textbf{A}_{11} & \textbf{A}_{12} \\ \textbf{A}_{21} & \textbf{A}_{22} \end{matrix}\right]$, and suppose we need $\textbf{A}_{11}$ from a factorization $\textbf{A}=\textbf{B}\textbf{C}$. Partition conformably $\textbf{B}=\left[\begin{matrix}\textbf{B}_1 \\ \textbf{B}_2\end{matrix}\right]$ and $\textbf{C}=\left[\begin{matrix}\textbf{C}_1 & \textbf{C}_2\end{matrix}\right]$; then $\textbf{A}=\left[\begin{matrix}\textbf{B}_1 \\ \textbf{B}_2\end{matrix}\right] \left[\begin{matrix}\textbf{C}_1 & \textbf{C}_2\end{matrix}\right] = \left[\begin{matrix} \textbf{B}_{1}\textbf{C}_1 & \textbf{B}_{1}\textbf{C}_2 \\ \textbf{B}_{2}\textbf{C}_1 & \textbf{B}_{2}\textbf{C}_2 \end{matrix}\right]$, and therefore $\textbf{A}_{11} = \textbf{B}_{1}\textbf{C}_1$.

  3. Spectral decomposition of a square matrix $\textbf{A}$: if $\textbf{A}_{n\times n}$ is diagonalizable, then $\textbf{A}=\textbf{B}^{-1}\textbf{D}\textbf{B}$, where $\textbf{D}$ is the diagonal matrix of the eigenvalues of $\textbf{A}$.

  4. Some properties of the Kronecker product, such as $(\textbf{A}\otimes \textbf{B})(\textbf{C}\otimes \textbf{D})=\textbf{AC}\otimes \textbf{BD}$. Using this identity together with the spectral decomposition, you can derive many useful properties, e.g. the Kronecker product of two positive definite matrices is positive definite.

  5. Hadamard product. There are some clever uses of the Hadamard product to show that a matrix is positive definite. As an example, suppose $\textbf{A}= \left[\begin{matrix} a & b \\ c & d \end{matrix}\right]$ and $\textbf{B}= \left[\begin{matrix} e & f \\ g & h \end{matrix}\right]$ are positive definite matrices. What can we say about the matrix $\left[\begin{matrix} ae & bf \\ cg & dh \end{matrix}\right]$? It is positive definite, but how can we show that? Consider $\textbf{A}\otimes \textbf{B}$, which is positive definite. Now let $\textbf{Z}_{2\times 4}$ be a selection matrix formed from rows of the identity; then $\textbf{Z}(\textbf{A}\otimes \textbf{B})\textbf{Z}^{T}$ is positive definite, and you can choose $\textbf{Z}$ so that $\textbf{Z}(\textbf{A}\otimes \textbf{B})\textbf{Z}^{T}=\left[\begin{matrix} ae & bf \\ cg & dh \end{matrix}\right]$.

Here, $\left[\begin{matrix} ae & bf \\ cg & dh \end{matrix}\right] = \textbf{A}\circ\textbf{B}$, the Hadamard product of $\textbf{A}$ and $\textbf{B}$.
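The selection-matrix trick above can be checked numerically. A NumPy sketch (the two positive definite matrices are my own test data): pick `Z` as rows 1 and 4 of the $4\times 4$ identity, so that $\textbf{Z}(\textbf{A}\otimes \textbf{B})\textbf{Z}^{T}$ extracts exactly the Hadamard product.

```python
import numpy as np

# Two small symmetric positive definite matrices (made-up test data).
A = np.array([[2., 1.],
              [1., 2.]])
B = np.array([[3., 1.],
              [1., 2.]])

K = np.kron(A, B)        # 4x4, positive definite by the mixed-product property
Z = np.eye(4)[[0, 3]]    # selection matrix: rows 1 and 4 of the identity

hadamard = Z @ K @ Z.T   # extracts the Hadamard product A ∘ B

print(np.allclose(hadamard, A * B))             # True: it equals A ∘ B
print(np.all(np.linalg.eigvalsh(hadamard) > 0)) # True: positive definite
```

Because `Z` has full row rank, congruence by `Z` preserves positive definiteness, which is exactly the argument in point 5 (this is the Schur product theorem).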

Last but not least, there are some identities in linear algebra, such as the Woodbury identity and the Schur complement, that you may find useful.
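For instance, the Woodbury identity $(\textbf{A}+\textbf{U}\textbf{C}\textbf{V})^{-1}=\textbf{A}^{-1}-\textbf{A}^{-1}\textbf{U}(\textbf{C}^{-1}+\textbf{V}\textbf{A}^{-1}\textbf{U})^{-1}\textbf{V}\textbf{A}^{-1}$ can be verified numerically; a NumPy sketch with randomly generated matrices (sizes and seed are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 5, 2
A = np.diag(rng.uniform(1.0, 2.0, n))   # easy-to-invert "base" matrix
U = rng.standard_normal((n, k))
C = np.eye(k)
V = rng.standard_normal((k, n))

# Right-hand side of the Woodbury identity: only k x k inverses of new terms.
Ainv = np.diag(1.0 / np.diag(A))
inner = np.linalg.inv(np.linalg.inv(C) + V @ Ainv @ U)
woodbury = Ainv - Ainv @ U @ inner @ V @ Ainv

# Left-hand side: invert the full n x n matrix directly.
direct = np.linalg.inv(A + U @ C @ V)

print(np.allclose(woodbury, direct))    # True
```

The practical point is that when $\textbf{A}^{-1}$ is cheap (here, diagonal) and $k \ll n$, the identity replaces one $n\times n$ inversion with a $k\times k$ one.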