Understanding vector–matrix size requirements in multiplication


Hi, I’ve recently started reading a book about deep learning (machine learning). It talks about multiplying vectors using the dot product and elementwise multiplication, for instance taking 3 inputs (a vector) and 3 weights (a vector) and multiplying them. From my understanding, when multiplying vectors the sizes need to be identical. The concept I’m having a hard time understanding is multiplying a vector by a matrix. The book gives an example of a 1x4 vector being multiplied by a 4x3 matrix, and the output is a 1x3 vector. I am confused because I assumed multiplying a vector by a matrix also requires an equal number of columns, but I have read that the matrix needs as many rows as the vector has columns. This is confusing to me because if I do not have an equal number of columns, how does my deep learning algorithm multiply each input in my vector by a corresponding weight?


For any matrix product $C=AB$, think of the element $c_{ij}$ as the dot product of the $i$th row of $A$ with the $j$th column of $B$. For that dot product to make sense, the “inner” dimensions of the matrix product—the number of columns of $A$ and the number of rows of $B$—have to match. The output matrix $C$ will have the “outer” dimensions—the same number of rows as $A$ and the same number of columns as $B$. You might think of these inner dimensions of $A$ and $B$ “canceling” when you multiply the matrices together.
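To see how this answers the original worry, here is a small NumPy sketch of the book's 1x4-times-4x3 example (the actual numbers are made up for illustration). Note that each output entry is the dot product of the whole input vector with one *column* of the weight matrix, so every input is still paired with a corresponding weight:

```python
import numpy as np

# Hypothetical values: a 1x4 input vector and a 4x3 weight matrix.
x = np.array([[1.0, 2.0, 3.0, 4.0]])  # shape (1, 4)
W = np.array([[0.1, 0.2, 0.3],
              [0.4, 0.5, 0.6],
              [0.7, 0.8, 0.9],
              [1.0, 1.1, 1.2]])       # shape (4, 3)

# The inner dimensions (4 and 4) match, so the product is defined.
y = x @ W
print(y.shape)  # (1, 3) -- the outer dimensions of x and W

# The first output is the dot product of x with the first column of W:
# 1*0.1 + 2*0.4 + 3*0.7 + 4*1.0 = 7.0
print(y[0, 0])
```

So the vector's 4 entries line up with the 4 rows of the matrix, not its columns; each of the matrix's 3 columns holds one set of 4 weights, and each produces one output.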