Consider $W$ a vector space of dimension $n$ with basis $\{e_i\}_{i=1}^n$, and a linear transformation from $W$ to $W$ with the matrix representation $M^i_j = M(e_j)^i$, i.e. the $i^{\text{th}}$ coefficient in the basis representation of $M(e_j)$. Show that $M$ is covariant in $j$ and contravariant in $i$.
I am using the approach: let $\bar{D}=\{d_i\}_{i=1}^n$ be any new basis; then there is a matrix $A$ such that $D=EA$, where $E$ is the matrix whose columns are the $e_i$'s and $D$ is the matrix whose columns are the $d_i$'s. Moreover, by uniqueness of the representation of each $d_i$ in the basis $\{e_i\}$, $A$ is invertible, so $E=DB$ where $B=A^{-1}$. Now let $1 \leq i,j \leq n$: $$M^i_j=M(e_j)^i=M(B_j^id_i)^i=B^i_jM(d_i)^i$$ where the repeated index in $B^i_j$ denotes a sum. I am not sure how to go further, or even how to show that it is covariant in $j$; any help would be appreciated.
I read ${M^i}_j=M(e_j)^i$ as "$i$ is the row index of $M$ and $j$ is the column index of $M$".
This can be read as a short rephrasing of the sloppy statement that "$M$ is covariant in $j$ and contravariant in $i\,.$"
Hopefully the long answer makes clearer what this really means. In fact, since only matrices and vectors are involved, this is all standard linear algebra.
Writing a vector in the two different bases $$ \boldsymbol{v}=v^ie_i=u^id_i $$ and using $ e_i={F^j}_id_j $ we get $ v^i{F^j}_i\,d_j=u^id_i $. It follows that the components of a vector transform as $$\tag{1} \boxed{ v^i{F^j}_i=u^j\,.} $$ You can read this as a matrix multiplication $FV=U$ where $V,U$ are column vectors. Sometimes this transformation is called contravariant to distinguish it from what follows.
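As a numerical sanity check of (1) (not part of the argument), one can build a random basis matrix $D$ whose columns are the $d_j$, a random matrix $F$ (almost surely invertible), and set $E=DF$ so that $e_i={F^j}_id_j$; then the coordinate vectors $V$ and $U=FV$ must describe the same geometric vector. This is a sketch using NumPy with arbitrary random data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Columns of D are the basis vectors d_j; F is a random (almost surely
# invertible) change-of-basis matrix with entries F^j_i.
D = rng.normal(size=(n, n))
F = rng.normal(size=(n, n))

E = D @ F                # columns of E are the basis vectors e_i = F^j_i d_j
V = rng.normal(size=n)   # components v^i in the basis {e_i}

U = F @ V                # transformation law (1): u^j = F^j_i v^i

# Both coordinate vectors represent the same geometric vector:
assert np.allclose(E @ V, D @ U)
```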
The linear map $W$ that $M$ represents maps $e_j$ to ${M^i}_j\,e_i\,.$ (In this basis the map $W$ is represented by $M\,.$) The map $W$ also maps $d_j$ to ${A^i}_j\,d_i$ (in this basis the map $W$ is represented by $A$). Using again $e_i={F^j}_i\,d_j$ shows $$ W(e_j)={M^i}_j\,e_i={M^i}_j\,{F^k}_i\,d_k\,. $$ Writing ${F_j}^i$ for the inverse of $F$ we get $d_j={F_j}^ie_i$ so that $$ W(d_j)={A^i}_j\,d_i\,. $$ But $W$ must map $v^je_j$ and $u^jd_j$ to the same vector $W(\boldsymbol{v})\,.$ Therefore, $v^j\,W(e_j)=u^j\,W(d_j)$ or $$ v^j\,{M^i}_j\,{F^k}_i\,d_k=u^j\,{A^i}_j\,d_i\,. $$ Using the inverse of (1) and relabelling a few dummy indices, $$ u^n{F_n}^j\,{M^i}_j\,{F^k}_i\,d_k=u^n\,{A^k}_n\,d_k\,. $$ Therefore, $$\tag{2} \boxed{{F_n}^j\,{M^i}_j\,{F^k}_i={A^k}_n\,.} $$ Because of the similarity of ${M^i}_j\,{F^k}_i$ with (1) someone has decided to say $M$ is contravariant in $i$ because the row index $i$ is contracted with ${F^k}_i$ from (1). Since the column index $j$ is contracted with the inverse ${F_n}^j$ it makes a lot of sense to call this covariant.
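In matrix notation (2) reads $A=FMF^{-1}$, the usual change-of-basis formula for a linear map. One can again check this numerically with random data: the image of $\boldsymbol{v}$ computed in the $e$-basis, $E(MV)$, must equal the image computed in the $d$-basis, $D(AU)$ with $U=FV$. A sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

D = rng.normal(size=(n, n))      # columns: the basis {d_j}
F = rng.normal(size=(n, n))      # change of basis: e_i = F^j_i d_j
E = D @ F                        # columns: the basis {e_i}

M = rng.normal(size=(n, n))      # matrix of the map W in the basis {e_i}
A = F @ M @ np.linalg.inv(F)     # equation (2) in matrix form: A = F M F^{-1}

V = rng.normal(size=n)           # coordinates of v in {e_i}
U = F @ V                        # coordinates of v in {d_j}, by (1)

# W(v) expressed through either basis is the same geometric vector:
assert np.allclose(E @ (M @ V), D @ (A @ U))
```

Note that the transformation is genuinely two-sided: $F$ acts on the row index and $F^{-1}$ on the column index.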
What it does not mean is that a column of $A$ is the contravariant image of a column of $M\,.$ The correct transformation is two-sided, as shown by (2).
As a simple exercise I suggest deriving the transformation law for a dual vector, that is a linear map $\boldsymbol{g}:\boldsymbol{v}\to\boldsymbol{g}(\boldsymbol{v})\in\mathbb R\,.$ This should be covariant: $$ g_i{F_j}^i=h_j\,. $$ Are $\boldsymbol{g}$ and $\boldsymbol{h}$ better represented as column vectors or as row vectors?
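If you want to check your answer to the exercise numerically: the stated covariant law $g_i{F_j}^i=h_j$ reads $h=gF^{-1}$ when $g,h$ are taken as row vectors, and the defining property is that the scalar $\boldsymbol{g}(\boldsymbol{v})$ does not depend on the basis. A sketch with random data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

F = rng.normal(size=(n, n))    # change of basis as in (1): U = F V
g = rng.normal(size=n)         # components g_i of the dual vector in {e_i}

h = g @ np.linalg.inv(F)       # covariant law: h_j = g_i (F^{-1})^i_j

V = rng.normal(size=n)         # components of v in {e_i}
U = F @ V                      # components of v in {d_j}

# The scalar g(v) is basis independent:
assert np.allclose(g @ V, h @ U)
```

That `g` and `h` multiply matrices from the left suggests the answer to the row-versus-column question.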