Matrix Multiplication not associative when matrices are vectors?


Wikipedia states:

Given three matrices A, B and C, the products (AB)C and A(BC) are defined if and only if the number of columns of A equals the number of rows of B and the number of columns of B equals the number of rows of C (in particular, if one of the products is defined, the other is also defined).

Row and column vectors can be thought of as just special cases of matrices. So given the above I would expect:

$$(a^Tb)c = a^T(bc)$$

However, the right side is undefined because you can't multiply two column vectors, seemingly contradicting Wikipedia. Am I mistaken? If not, can we only consider matrix multiplication to be associative in contexts where we know no intermediate product is $1\times 1$?


BEST ANSWER

The issue is that, technically, $(a^T b)c$ doesn't exist either. You see, we often pretend $a^T b$ is the scalar $k:=a\cdot b$, but it's really a $1\times 1$ matrix whose only entry is $k$. It's one thing to left-multiply $c$ by the scalar $k$; it's another to left-multiply $c$ by the $1\times 1$ matrix itself, which you can't do when $c$ has $n\ne 1$ entries. Under the scalar interpretation, $(a\cdot b)c = kc = kI_n c$, where $I_m$ is the $m\times m$ identity matrix; the matrix interpretation would instead require multiplying a $1\times 1$ matrix by an $n\times 1$ vector, which is undefined.
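A quick pure-Python check illustrates this (the vector values are made up for the example; `matmul` is a hypothetical helper that enforces the dimension rule, written here so the sketch needs no libraries):

```python
# Matrices represented as lists of rows; matmul enforces cols(A) == rows(B).
def matmul(A, B):
    rA, cA = len(A), len(A[0])
    rB, cB = len(B), len(B[0])
    if cA != rB:
        raise ValueError(f"cannot multiply {rA}x{cA} by {rB}x{cB}")
    return [[sum(A[i][k] * B[k][j] for k in range(cA)) for j in range(cB)]
            for i in range(rA)]

# Column vectors with n = 3 entries each (arbitrary example values).
a = [[1], [2], [3]]
b = [[4], [5], [6]]
c = [[7], [8], [9]]
aT = [[1, 2, 3]]          # a^T is 1x3

ab = matmul(aT, b)        # a^T b is the 1x1 matrix [[32]], not the scalar 32
print(ab)                 # [[32]]

# Left-multiplying the 3x1 vector c by the 1x1 matrix a^T b is undefined:
try:
    matmul(ab, c)
except ValueError as e:
    print(e)              # cannot multiply 1x1 by 3x1

# And bc (3x1 times 3x1) is undefined too, so a^T(bc) never gets started:
try:
    matmul(b, c)
except ValueError as e:
    print(e)              # cannot multiply 3x1 by 3x1
```

So in strict matrix terms both sides fail, just at different steps.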


If $a,b,c$ are $n\times 1$ vectors with $n\ne 1$, Wikipedia's condition is not met: the factors have sizes $1\times n$, $n\times 1$ and $n\times 1$, and the number of columns of $b$ (namely $1$) does not equal the number of rows of $c$ (namely $n$). So neither product is defined, and there is no contradiction.

The LHS only appears to exist because one identifies the $1\times 1$ matrix $a^Tb$ with a scalar, as @J.G. explains above.
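A short sketch of the scalar identification (example values are arbitrary): once $a^Tb$ is read as the scalar $k = a\cdot b$, the product $kc$ is well defined and agrees with $(kI_n)c$.

```python
# Vectors as plain length-n lists (arbitrary example values).
n = 3
a, b, c = [1, 2, 3], [4, 5, 6], [7, 8, 9]

k = sum(ai * bi for ai, bi in zip(a, b))   # the dot product a . b = 32

kc = [k * ci for ci in c]                  # scalar times vector: always defined

# (k I_n) c computed entrywise: row i of k*I_n is k*e_i, so ((k I_n) c)_i = k*c_i.
kInc = [sum((k if i == j else 0) * c[j] for j in range(n)) for i in range(n)]

assert kc == kInc
print(kc)   # [224, 256, 288]
```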