How to multiply a vector from the left side with a matrix?

I have always dealt with vector–matrix multiplication where the vector is the right multiplicand, but I am not sure how to compute the product of a matrix and a vector when the vector is the left multiplicand.

I have the following row vector

$$\beta = \begin{pmatrix} \beta_0 & \beta_1 \end{pmatrix} \in \mathbb{R}^{1 \times 2}$$

and a general matrix

$$A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22}\end{pmatrix} \in \mathbb{R}^{2 \times 2}$$

What would be the algorithm to multiply $\beta \cdot A$? Of course the result is a $1 \times 2$ row vector.

There are 2 answers below.

Accepted answer

So essentially you wish to compute: $$ \begin{pmatrix} \beta_0&\beta_1 \end{pmatrix} \begin{pmatrix} a_{11}&a_{12}\\ a_{21}&a_{22} \end{pmatrix}.$$ This equals: $$\begin{pmatrix} a_{11}\beta_0+a_{21}\beta_1&a_{12}\beta_0+a_{22}\beta_1 \end{pmatrix} . $$ Each entry of the result is the dot product of $\beta$ with the corresponding column of $A$.
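As a numerical sanity check, the same product can be computed with NumPy; the concrete values for $\beta_0, \beta_1$ and the $a_{ij}$ below are made up for illustration:

```python
import numpy as np

# Concrete values for beta = (beta_0, beta_1) and the 2x2 matrix A
beta = np.array([[2.0, 3.0]])  # shape (1, 2): a row vector
A = np.array([[1.0, 4.0],
              [5.0, 6.0]])     # shape (2, 2)

result = beta @ A              # shape (1, 2)
# Entry j is a_{1j}*beta_0 + a_{2j}*beta_1:
#   (1*2 + 5*3, 4*2 + 6*3) = (17, 26)
print(result)                  # [[17. 26.]]
```

Note that `beta` is kept two-dimensional (shape `(1, 2)`) so the result is an explicit $1 \times 2$ row vector rather than a flat 1-D array.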

Second answer

Matrix multiplication is defined so that the entry $(i,j)$ of the product is the dot product of the left matrix's row $i$ and the right matrix's column $j$.

If you want to reduce everything to matrices acting on the left, we have the identity $xA = \big(A^Tx^T\big)^T$ where $T$ denotes the transpose. This is because $(AB)^T = B^TA^T$, and the operation that sends a matrix to its transpose is self-inverse.
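The identity is easy to verify numerically; here is a quick check with NumPy (values chosen arbitrarily):

```python
import numpy as np

x = np.array([[2.0, 3.0]])       # row vector, shape (1, 2)
A = np.array([[1.0, 4.0],
              [5.0, 6.0]])

left = x @ A                     # multiply from the left
right = (A.T @ x.T).T            # same result via the transpose identity
print(np.allclose(left, right))  # True
```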