Matrices for linear maps


Given a linear map $g : F^n \to F^m$, and standard bases of $F^n$ and $F^m$ denoted by $\{e_{j}\}$ and $\{e_{i}^{'}\}$ respectively, \begin{aligned} & g(e_{j})=\sum_{i=1}^m a_{i,j}\,e_{i}^{'} \\ \end{aligned} If $v \in F^n$, we have

\begin{aligned} g(v)&=g\Big(\sum_{j=1}^n \beta_j e_{j}\Big) \\ &=\sum_{j=1}^n \beta_j\, g(e_{j})\\ &=\sum_{j=1}^n \beta_j \sum_{i=1}^m a_{i,j}\,e_{i}^{'}\\ \end{aligned}

I don't understand how the next two steps follow from the above: \begin{aligned} &=\sum_{i=1}^n\Big(\sum_{j=1}^m a_{i,j}\beta_j \Big)e_{i}^{'}\\ & = Av \end{aligned}

How is the summation index change correct? And how do we get the matrix $A$ and the vector $v$ from the second-to-last step?

Edit:

Assuming the index change is a typo, my understanding of the second-to-last step is this: the $a_{i,j}$ are the entries of the matrix $A$, and for each column $j$ of $A$ the scalar $\beta_j$ multiplies every entry of that column; summing these up, each resulting total multiplies a basis vector $e_{i}^{'}$, so we obtain a linear combination of the basis vectors of $F^m$. But it is not clear to me how this factors into $A$ and $v$, where $v$ is a linear combination of basis vectors of $F^n$.
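The column picture described above can be checked numerically. A minimal sketch in plain Python, with a made-up $2 \times 3$ matrix $A$ and coefficients $\beta$ (all values hypothetical):

```python
# Hypothetical example: m = 2, n = 3; A[i][j] = a_{i,j}, beta holds the
# coefficients of v in the standard basis of F^3.
A = [[1, 2, 3],
     [4, 5, 6]]
beta = [7, 8, 9]
m, n = 2, 3

# Column picture: Av = sum_j beta_j * (j-th column of A).
cols = [[A[i][j] for i in range(m)] for j in range(n)]
col_combo = [0] * m
for j in range(n):
    for i in range(m):
        col_combo[i] += beta[j] * cols[j][i]

# Row picture (definition of the matrix-vector product): (Av)_i = sum_j a_{i,j} beta_j.
Av = [sum(A[i][j] * beta[j] for j in range(n)) for i in range(m)]

assert col_combo == Av  # the two descriptions give the same vector
print(Av)
```

Both descriptions produce the same vector, which is exactly why the double sum can be read either column-by-column or row-by-row.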

N.B. Taken from the book *Matrix Theory* by David Lewis.

Answer:

The index change is indeed a typo; the correct derivation is \begin{aligned} g(v)&=g\Big(\sum_{j=1}^n \beta_j e_{j}\Big) \\ &=\sum_{j=1}^n \beta_j\, g(e_{j})\\ &=\sum_{j=1}^n \beta_j \sum_{i=1}^m a_{i,j}\,e_{i}^{'}\\ &=\sum_{i=1}^m\Big(\sum_{j=1}^n a_{i,j}\beta_j \Big)e_{i}^{'}\\ & = Av \end{aligned} Swapping the two finite sums is always valid, since addition is commutative and associative. The last identity is just the definition of the matrix-vector product, where $A$ is the $m \times n$ matrix with entries $a_{i,j}$.
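The interchange of the two finite sums can be verified concretely. A small sketch in plain Python (the matrix entries, coefficients, and dimensions below are all hypothetical), building both sides as explicit linear combinations of the basis vectors $e'_i$:

```python
# Hypothetical data: m = 2, n = 3; a[i][j] = a_{i,j}.
a = [[1, 2, 3], [4, 5, 6]]
beta = [7, 8, 9]
m, n = 2, 3
e_prime = [[1, 0], [0, 1]]  # standard basis e'_1, e'_2 of F^2 as coordinate lists

def scale(c, vec):
    return [c * x for x in vec]

def add(u, w):
    return [x + y for x, y in zip(u, w)]

# Left side: sum over j first, then i, as in sum_j beta_j sum_i a_{i,j} e'_i.
lhs = [0] * m
for j in range(n):
    for i in range(m):
        lhs = add(lhs, scale(beta[j] * a[i][j], e_prime[i]))

# Right side: sum over i of (sum_j a_{i,j} beta_j) e'_i.
rhs = [0] * m
for i in range(m):
    coeff = sum(a[i][j] * beta[j] for j in range(n))
    rhs = add(rhs, scale(coeff, e_prime[i]))

assert lhs == rhs  # reordering the finite double sum changes nothing
print(rhs)
```

Each side adds up the same finite collection of terms $\beta_j\, a_{i,j}\, e'_i$, just grouped differently.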

Edit: Actually the last identity is a slight abuse of notation, since $A$ properly multiplies the coordinate vector of $v$, not $v$ itself. That coordinate vector is $\vec \beta:=(\beta_j)_{j=1,\ldots,n} \in F^n$. So $Av$ here means the linear combination of the basis vectors $e'_i$ with coefficients $(A\vec\beta)_i$, i.e. $$Av=\sum_{i=1}^m(A\vec \beta)_i\, e'_i$$ (With the standard basis of $F^n$, the coordinate vector of $v$ is $v$ itself, so the identification is harmless.)
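The identification above can be checked directly: computing $g(v)$ by linearity from the images $g(e_j)$ gives the same coordinates as the product $A\vec\beta$. A sketch in plain Python, with hypothetical images of the basis vectors:

```python
# Hypothetical map g : F^3 -> F^2, given by the images of the standard basis.
# g(e_j) is the j-th column of A, so its entries are the a_{i,j} of the text.
g_of_e = [[1, 4], [2, 5], [3, 6]]   # g(e_1), g(e_2), g(e_3) as coordinate lists
beta = [7, 8, 9]                    # coefficients of v, i.e. the vector beta
m, n = 2, 3

# g(v) by linearity: sum_j beta_j * g(e_j).
gv = [sum(beta[j] * g_of_e[j][i] for j in range(n)) for i in range(m)]

# (A beta)_i = sum_j a_{i,j} beta_j, with a_{i,j} read off from the columns g(e_j).
A_beta = [sum(g_of_e[j][i] * beta[j] for j in range(n)) for i in range(m)]

assert gv == A_beta  # the coordinates of g(v) in {e'_i} are exactly A beta
print(A_beta)
```

This is the content of the identity $g(v)=\sum_{i=1}^m(A\vec\beta)_i\, e'_i$: the matrix product computes the coordinates of $g(v)$ in the target basis.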