Alternate convention in matrix multiplication?


I'm going through Halmos's *Finite-Dimensional Vector Spaces*. I noticed an oddity in a proof where the indices seemed to be swapped when multiplying a matrix by a vector.

I went back about 30 pages to the point where he defines the convention, and after a lot of fiddling with his definition and the one I'm familiar with (which appears on Wikipedia), I've realized they differ.

His definition (page 65) is:

$Ax_j = \sum_i \alpha_{ij}x_i$

In his definition, $i$ indexes rows and $j$ indexes columns, as we would expect. However, the definition we would expect under modern conventions is:

$Ax_j = \sum_i \alpha_{ji}x_i$

where you sum over a dummy variable on the "inside" indices, as in ordinary matrix multiplication.

Halmos seems to work almost exclusively with $n\times n$ matrices, so this is not a serious issue: each matrix is simply the transpose of the one you would get under the standard convention. Still, I was surprised.
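To convince myself of the transpose relationship, I wrote a small NumPy sketch. The linear map `T` below is my own toy example, not one from the book; the point is only that filling in the coordinates of $Ax_j$ as column $j$ (one convention) versus row $j$ (the other) yields matrices that are transposes of each other:

```python
import numpy as np

# A toy linear map on R^2, defined by its action on basis vectors
# (an assumed example for illustration, not taken from Halmos).
def T(v):
    x, y = v
    return np.array([2 * x + 3 * y, 5 * x + 7 * y])

e = np.eye(2)  # standard basis e_0, e_1

# Convention Ax_j = sum_i a_ij x_i:
# the coordinates of T(e_j) fill COLUMN j of the matrix.
A_cols = np.column_stack([T(e[j]) for j in range(2)])

# Convention Ax_j = sum_i a_ji x_i:
# the coordinates of T(e_j) fill ROW j instead.
A_rows = np.vstack([T(e[j]) for j in range(2)])

# The two matrices are transposes of each other.
assert np.array_equal(A_cols, A_rows.T)
```

Under the first convention the matrix acts on column vectors in the usual way (`A_cols @ v` reproduces `T(v)`); under the second you would have to multiply by the transpose instead.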

Does anyone have more insight or information on the conventions used in this book or historically? Is this convention still in use today?