Formulas in linear algebra are not always easy to remember, and mnemonics can be a useful aid to memory.
- Do you know any useful mnemonics for this purpose?
I'll give two examples:
- For finding the inverse of a matrix one can use the mnemonic Detminstra, which unpacks as: 1. calculate the determinant, 2. for every entry find its minor with the appropriate sign (the cofactor), 3. transpose the obtained matrix; dividing by the determinant then gives the inverse.
- The other example is Avvedia, a shortcut for the eigenvector formula $AV=VD$. Knowing this formula we can easily obtain $A=VDV^{-1}$, or the twin formula for the diagonal matrix, $D=V^{-1}AV$. Sometimes $V$ and $V^{-1}$ are erroneously interchanged; with Avvedia it is easier to check the correctness of such a formula.
What other mnemonics could be useful in linear algebra?
Added later:
- Furorri: a full row rank matrix has a right inverse (analogously, Fucorlin for a full column rank matrix and its left inverse); these two inverses are easy to interchange erroneously.
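The formulas behind the Avvedia and Furorri mnemonics are easy to sanity-check numerically. Here is a minimal NumPy sketch (the variable names are mine, chosen for illustration) verifying $AV=VD$, its rearrangements, and the right inverse of a full row rank matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Avvedia: AV = VD, hence A = V D V^{-1} and D = V^{-1} A V.
A = rng.standard_normal((4, 4))
eigvals, V = np.linalg.eig(A)        # columns of V are eigenvectors of A
D = np.diag(eigvals)

assert np.allclose(A @ V, V @ D)                  # AV = VD
assert np.allclose(A, V @ D @ np.linalg.inv(V))   # A = V D V^{-1}
assert np.allclose(D, np.linalg.inv(V) @ A @ V)   # D = V^{-1} A V

# Furorri: a full-row-rank A has a right inverse, e.g. R = A^T (A A^T)^{-1}.
A = rng.standard_normal((3, 5))      # 3 rows; almost surely full row rank
R = A.T @ np.linalg.inv(A @ A.T)
assert np.allclose(A @ R, np.eye(3))              # A R = I, but R A != I
```

Note that swapping $V$ and $V^{-1}$ in either rearrangement makes the corresponding `allclose` check fail, which is exactly the mistake the mnemonic guards against.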
(Too long for a comment.)
I agree with some commenters here. Before you have built up muscle memory, it is often easier, or even faster, to derive what you need than to recall a mnemonic, and the derivation also deepens your understanding. At least that is my own experience where linear algebra is concerned.
In recent years, the only formula for which I almost need a mnemonic is the one for the determinant of a block-$2\times2$ matrix in which two adjacent sub-blocks commute. Consider $$ M=\pmatrix{A&B\\ C&D}, $$ where the four sub-blocks are square submatrices of identical sizes over some commutative ring. When two adjacent sub-blocks of $M$ commute, we have (cf. John Silvester, *Determinants of Block Matrices*) $$ \det M= \begin{cases} \det(AD-BC) & \text{ if } C,D \text{ commute},\\ \det(DA-CB) & \text{ if } A,B \text{ commute},\\ \det(DA-BC) & \text{ if } B,D \text{ commute},\\ \det(AD-CB) & \text{ if } A,C \text{ commute}. \end{cases} $$ This is analogous to the formula $\det\pmatrix{a&b\\ c&d}=ad-bc$, but care must be taken here because the orders of $A,B,C,D$ in the polynomials above (i.e. $AD-BC$ etc.) depend on which sub-block commutes with which.
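Before looking at the derivation, it may help to check one of the four cases numerically. A convenient way to manufacture a commuting pair is to take $D$ to be a polynomial in $C$; the sketch below (a NumPy check I wrote for illustration) does this for the first case:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))
D = 2 * C @ C + C + 3 * np.eye(n)    # a polynomial in C, so C and D commute

M = np.block([[A, B], [C, D]])
assert np.allclose(C @ D, D @ C)     # the commutation hypothesis holds
assert np.allclose(np.linalg.det(M), np.linalg.det(A @ D - B @ C))
```

Replacing `A @ D - B @ C` by, say, `D @ A - B @ C` makes the check fail for generic matrices, illustrating that the order of the factors really does matter.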
Kind of messy, right? But if you truly understand how these are derived, you don't need any mnemonics. First, we use Gaussian elimination to eliminate the off-diagonal block of the commuting pair. E.g. in the first case above, i.e. when $C$ and $D$ commute, we have $$ \pmatrix{A&B\\ C&D}\pmatrix{D&0\\ -C&I}=\pmatrix{AD-BC&B\\ 0&D}.\tag{1} $$ Taking determinants on both sides, we get $\det(M)\det(D)=\det(AD-BC)\det(D)$. Cancelling $\det(D)$, we get the result.
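The block identity $(1)$ itself can also be verified directly; the bottom-left block comes out as $CD-DC$, which vanishes precisely because $C$ and $D$ commute. A small NumPy sketch (again with an illustrative commuting pair built as a polynomial):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))
D = C @ C - 2 * C + np.eye(n)        # commutes with C
I, Z = np.eye(n), np.zeros((n, n))

# identity (1): M @ [[D, 0], [-C, I]] == [[AD - BC, B], [0, D]]
lhs = np.block([[A, B], [C, D]]) @ np.block([[D, Z], [-C, I]])
rhs = np.block([[A @ D - B @ C, B], [Z, D]])
assert np.allclose(lhs, rhs)
```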
At this point, the derivation still looks tedious. However, note that the second block column of $(1)$ does not really matter to the end result. So, to find the right polynomial, we only need to calculate $$ \pmatrix{A&B\\ C&D}\pmatrix{D\\ -C}. $$ In other words, when a row (or column) of $M$ consists of commuting sub-blocks, we use a block column (or row) vector to kill off the off-diagonal commuting block ($C$ in this example), and the only thing you need to memorise is this: write the commuting pair in reverse order, with the off-diagonal block negated.
With this in mind, it is now dead easy to see what polynomial to use in each of the above four cases: $$ \begin{cases} \pmatrix{A&B\\ C&D}\pmatrix{D\\ -C}=\pmatrix{AD-BC\\ 0} & \text{ if } C,D \text{ commute},\\ \\ \pmatrix{A&B\\ C&D}\pmatrix{-B\\ A}=\pmatrix{0\\ DA-CB} & \text{ if } A,B \text{ commute},\\ \\ \pmatrix{D&-B}\pmatrix{A&B\\ C&D}=\pmatrix{DA-BC&0} & \text{ if } B,D \text{ commute},\\ \\ \pmatrix{-C&A}\pmatrix{A&B\\ C&D}=\pmatrix{0&AD-CB} & \text{ if } A,C \text{ commute}. \end{cases} $$
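All four annihilation identities above can be confirmed in a few lines. The following NumPy sketch (my own check, with each commuting pair again built via a polynomial) verifies that the prescribed block vector kills the right block in every case:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
Z = np.zeros((n, n))

def poly(X):
    # any polynomial in X commutes with X
    return X @ X + 2 * X - np.eye(n)

def rand():
    return rng.standard_normal((n, n))

# Case 1: C, D commute  ->  M @ [D; -C] = [AD - BC; 0]
A, B, C = rand(), rand(), rand(); D = poly(C)
M = np.block([[A, B], [C, D]])
assert np.allclose(M @ np.vstack([D, -C]), np.vstack([A @ D - B @ C, Z]))

# Case 2: A, B commute  ->  M @ [-B; A] = [0; DA - CB]
A, C, D = rand(), rand(), rand(); B = poly(A)
M = np.block([[A, B], [C, D]])
assert np.allclose(M @ np.vstack([-B, A]), np.vstack([Z, D @ A - C @ B]))

# Case 3: B, D commute  ->  [D, -B] @ M = [DA - BC, 0]
A, B, C = rand(), rand(), rand(); D = poly(B)
M = np.block([[A, B], [C, D]])
assert np.allclose(np.hstack([D, -B]) @ M, np.hstack([D @ A - B @ C, Z]))

# Case 4: A, C commute  ->  [-C, A] @ M = [0, AD - CB]
A, B, D = rand(), rand(), rand(); C = poly(A)
M = np.block([[A, B], [C, D]])
assert np.allclose(np.hstack([-C, A]) @ M, np.hstack([Z, A @ D - C @ B]))
```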