Given an $n$-dimensional vector space $V$ over a field $K$, and $f,g\in End(V)$, let $A=[\alpha_{ij}]$ be the representative matrix of $f$ with respect to the basis $B=\{\underline a_1, \underline a_2, \cdots, \underline a_n\}$, and let $A'=[\alpha'_{ij}]$ be the representative matrix of $g$ with respect to the basis $B'=\{\underline a'_1, \underline a'_2, \cdots, \underline a'_n\}.$
We have $f=g$ iff $A=CA'C^{-1}$, where $C=[\gamma_{ij}]$ is the representative matrix of $h\in Aut(V)$ with respect to $B$, such that $h(\underline a_i)=\underline a'_i$ for $i=1,2,\cdots,n$.
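Before looking at the proof, a quick numerical sanity check of the statement may help. This is only a sketch with data I chose myself (none of it comes from the notes): $V=\mathbb R^2$ over $\mathbb R$, $B$ the standard basis, $B'=\{(1,1),(1,2)\}$, and $f=g$ the map $x\mapsto Fx$.

```python
import numpy as np

# Assumed data (my own choice, for illustration): V = R^2 over R,
# B the standard basis, B' = {(1,1), (1,2)}, and f = g = (x |-> F x).
F = np.array([[2.0, 1.0],
              [0.0, 3.0]])
f = lambda x: F @ x

B_prime = [np.array([1.0, 1.0]), np.array([1.0, 2.0])]

# A = [f]_B: column k holds the B-coordinates of f(e_k); B is standard, so A = F.
A = np.column_stack([f(e) for e in np.eye(2)])

# C = [h]_B: since h(a_i) = a'_i, column i holds the B-coordinates of a'_i.
C = np.column_stack(B_prime)

# A' = [g]_{B'}: column k holds the B'-coordinates of g(a'_k),
# obtained by solving C x = g(a'_k).
A_prime = np.column_stack([np.linalg.solve(C, f(a)) for a in B_prime])

# The claim: A = C A' C^{-1}.
print(np.allclose(A, C @ A_prime @ np.linalg.inv(C)))  # prints True
```

Note that $A'$ is computed column by column from the definition (solving $Cx=f(\underline a'_k)$ for the $B'$-coordinates), not from the similarity formula being checked.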
I have some notes with the following proof, which I'm failing to grasp because it offers no explanations.
I don't understand the second equality, nor the following ones, which are similar in nature.

The matrix of a linear transformation $\varphi$ with respect to a basis $B=(a_1,\dots,a_n)$ has the $B$-coordinates of $\varphi(a_i)$ as its $i$th column.
To verify this, observe that the $B$-coordinates of $a_i$ are just the $n$-tuple $e_i$ of scalars having $1$ in the $i$th coordinate and $0$ elsewhere, and that if a matrix $M$ has columns $m_1,\dots,m_n$, then $M\cdot e_i=m_i$.
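This column-extraction fact is easy to check numerically; the matrix $M$ below is an arbitrary example of my own:

```python
import numpy as np

# Any matrix works; this one is chosen only for illustration.
M = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

e2 = np.array([0, 1, 0])  # 1 in the 2nd coordinate, 0 elsewhere
print(M @ e2)             # prints [2 5 8], the 2nd column of M
```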
So, in the exercise, the coordinates of $h(a_k)$ form the $k$th column of $C$: $$[h(a_k)]_B=[h]_B[a_k]_B=Ce_k=(k\,\text{th column of }C)=\pmatrix{\gamma_{1k}\\ \vdots\\ \gamma_{nk}}$$ Hence $h(a_k)\ =\ \sum_i\gamma_{ik}a_i$.
The same reasoning applied to $f$ gives $f(a_k)=\sum_i\alpha_{ik}a_i$, and applied to $g$ in the basis $B'$ gives $g(a'_k)=\sum_i\alpha'_{ik}a'_i$.
Note that multiplication by a matrix on the right can be executed columnwise: $M\cdot[u_1\,|\,\dots\,|\,u_n]\ =\ [Mu_1\,|\,\dots\,|\,Mu_n]$. Since the $k$th column of $C$ is $[a'_k]_B$, the $k$th column of $AC$ is $A[a'_k]_B=[f(a'_k)]_B$, so $$AC=[f(a'_1)\,|\,\dots\,|\,f(a'_n)]_B$$ On the other hand, assuming $f=g$, the $k$th column of $A'$ is $[g(a'_k)]_{B'}=[f(a'_k)]_{B'}$, so $CA'=C\cdot[f(a'_1)\,|\,\dots\,|\,f(a'_n)]_{B'}$, while $C\pmatrix{\vartheta_1\\ \vdots\\ \vartheta_n}=\left[\sum_i\vartheta_ia'_i\right]_{B}$, i.e. $C$ converts $B'$-coordinates into $B$-coordinates.
Putting these together, $CA'=[f(a'_1)\,|\,\dots\,|\,f(a'_n)]_B=AC$, and therefore $A=CA'C^{-1}$.
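This last step can also be checked numerically. The data below are my own assumptions, not from the notes: $V=\mathbb R^2$, $B$ the standard basis, $B'=\{(1,1),(1,2)\}$, and $f=g$ given by $F$; the check confirms that $AC$ and $CA'$ agree column by column, each column being $[f(\underline a'_k)]_B$.

```python
import numpy as np

# Assumed setup (illustration only): B standard, B' = {(1,1), (1,2)},
# f = g = (x |-> F x).
F = np.array([[2.0, 1.0],
              [0.0, 3.0]])
A = F                                # [f]_B (B is the standard basis)
C = np.array([[1.0, 1.0],
              [1.0, 2.0]])           # columns: B-coordinates of a'_1, a'_2
A_prime = np.linalg.inv(C) @ F @ C   # [g]_{B'}, with g = f

# Column k of AC is A [a'_k]_B = [f(a'_k)]_B ...
lhs = A @ C
# ... and column k of CA' is C [f(a'_k)]_{B'} = [f(a'_k)]_B as well.
rhs = C @ A_prime
print(np.allclose(lhs, rhs))         # prints True

# Each column really is the B-coordinate vector of f(a'_k):
for k in range(2):
    assert np.allclose(lhs[:, k], F @ C[:, k])
```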