The coordinate map $g_1 : V \rightarrow \mathbb R^n$ is an isomorphism. Usually this is illustrated by taking a vector $v\in V$, expressing it as a linear combination of basis vectors $v=a_1v_1 + a_2v_2 + \ldots + a_nv_n$, and then forming the coordinate vector $\begin{bmatrix}a_1&a_2&\ldots & a_n\end{bmatrix}^T$. I also learned that $\mathcal{M} : \mathcal L(V,W) \rightarrow M_{m\times n}(\mathbb R)$ is an isomorphism. Based on that, what is the matrix $A\in M_{m\times n}$ corresponding to $g_1$?
I'm wondering whether this is even a sensible question. $A$ depends on a choice of bases for $V$ and $\mathbb R^n$, so my guess was to take a basis for $V$ and use its coordinate vectors as the basis for $\mathbb R^n$. Then the result is the identity matrix $I$.
I'm even more confused since I've seen that a matrix is supposed to stand in for a composition of transformations looking something like $g_2 \circ f \circ g_1^{-1}$ for $f : V \rightarrow W$, $g_1$ as defined, and $g_2 : W \rightarrow \mathbb R^m$. But if $g_1$ and $g_2$ have corresponding matrices of their own, then each of those could be seen as some other composition of three functions, and so on forever.
You haven't defined what $\mathcal{M}$ is, so it doesn't make sense to say $\mathcal{M}:\mathcal{L}(V,W)\to M_{m\times n}(\Bbb{R})$ is an isomorphism. What does make sense is to say that $\mathcal{L}(V,W)\cong M_{m\times n}(\Bbb{R})$ (assuming $\dim V=n,\dim W=m$), i.e. the two vector spaces are isomorphic.
For example, if you fix a basis $\beta=\{v_1,\dots, v_n\}$ for $V$, and you consider the corresponding coordinate map $g_{\beta}:V\to \Bbb{R}^n$, then you're right that this is an isomorphism of $V$ onto $\Bbb{R}^n$. Also, you're perfectly correct in your second paragraph: matrix representations for $g_{\beta}$ depend on the choice of basis on the domain, $V$, and the target, $\Bbb{R}^n$. For instance, if you choose $\beta$ on the domain and $\sigma=\{e_1,\dots, e_n\}$, the 'standard' basis, on the target $\Bbb{R}^n$, then indeed the matrix representation is the identity $[g_{\beta}]_{\beta}^{\sigma}=I_n$ (after all, the definition of the linear map $g_{\beta}$ is designed to make this equation true).
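You can check this numerically. Here is a minimal sketch with a made-up concrete instance: take $V=\Bbb{R}^3$ and let $\beta$ be the columns of an invertible matrix `B` (any invertible `B` works); then $g_\beta(v)$ is the solution of $Ba=v$, and the $j$-th column of $[g_\beta]_\beta^\sigma$ is $g_\beta(v_j)$ expressed in the standard basis.

```python
import numpy as np

# Hypothetical concrete basis beta for V = R^3, stored as columns of B.
B = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])

# Coordinate map g_beta: solve B a = v for the beta-coordinates a of v.
def g_beta(v):
    return np.linalg.solve(B, v)

# j-th column of [g_beta]_beta^sigma = standard coordinates of g_beta(v_j).
# Since g_beta(v_j) = e_j by construction, the matrix is the identity.
M = np.column_stack([g_beta(B[:, j]) for j in range(3)])
print(np.allclose(M, np.eye(3)))  # True
```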
If for some reason you decide to mess with the bases, say $\beta'=\{2v_1,v_2,\dots, v_n\}$ and $\sigma'=\{e_1,2e_2,\dots, 2e_n\}$, then \begin{align} [g_{\beta}]_{\beta'}^{\sigma'}&= \begin{pmatrix} 2 & 0& 0 &\dots & 0\\ 0 & \frac{1}{2}& 0 & \dots & 0\\ 0 & 0 &\frac{1}{2} & \dots & 0\\ \vdots & \vdots & \vdots & \ddots & \vdots\\ 0 & 0 & 0 & \dots & \frac{1}{2} \end{pmatrix}. \end{align} So the result is no longer the identity matrix (but it is still an invertible matrix, as we expect, since $g_{\beta}$ is an isomorphism).
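This diagonal matrix can also be verified numerically. A sketch, again with a made-up basis `B` for $V=\Bbb{R}^3$: the $j$-th column of $[g_\beta]_{\beta'}^{\sigma'}$ is the $\sigma'$-coordinate vector of $g_\beta(b'_j)$, where $b'_j$ runs over $\beta'$.

```python
import numpy as np

# Hypothetical basis beta for V = R^3 (columns of B); g_beta(v) solves B a = v.
B = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])

def g_beta(v):
    return np.linalg.solve(B, v)

# Modified bases: beta' = {2v1, v2, v3} on the domain,
# sigma' = {e1, 2e2, 2e3} on the target (columns of sigma_prime).
beta_prime = np.column_stack([2*B[:, 0], B[:, 1], B[:, 2]])
sigma_prime = np.diag([1., 2., 2.])

# Column j = sigma'-coordinates of g_beta(b'_j).
M = np.column_stack([np.linalg.solve(sigma_prime, g_beta(beta_prime[:, j]))
                     for j in range(3)])
print(M)  # diag(2, 1/2, 1/2)
```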
Regarding the triple composition, here's what's going on. Say you have a linear map $f:V\to W$, where $\dim V=n,\dim W=m$, and say you fix bases $\beta$ for $V$ and $\gamma$ for $W$. Then, we have linear isomorphisms $g_{\beta}:V\to\Bbb{R}^n$ and $g_{\gamma}:W\to\Bbb{R}^m$. This gives a corresponding linear map $F:\Bbb{R}^n\to\Bbb{R}^m$ defined by the composition $F=g_{\gamma}\circ f\circ g_{\beta}^{-1}$. Now, the statement is that \begin{align} [F]_{\sigma_n}^{\sigma_m}&=[f]_{\beta}^{\gamma}, \end{align} i.e. the matrix representation of the composed map $F:\Bbb{R}^n\to\Bbb{R}^m$, with respect to the standard bases $\sigma_n,\sigma_m$ on $\Bbb{R}^n,\Bbb{R}^m$, is equal to the matrix representation of $f:V\to W$ with respect to the bases $\beta,\gamma$ on $V,W$. The above equation holds because composition corresponds to multiplication of matrices once the correct bases are chosen: \begin{align} [F]_{\sigma_n}^{\sigma_m}&=[g_{\gamma}\circ f\circ g_{\beta}^{-1}]_{\sigma_n}^{\sigma_m}\\ &=[g_{\gamma}]^{\sigma_m}_{\gamma}\cdot[f]_{\beta}^{\gamma}\cdot[g_{\beta}^{-1}]_{\sigma_n}^{\beta}\\ &=I_m\cdot [f]_{\beta}^{\gamma}\cdot (I_n)^{-1}\\ &=[f]_{\beta}^{\gamma}. \end{align} This also resolves the worry about an infinite regress: relative to the natural bases, the coordinate maps themselves are represented by identity matrices, so the unwinding stops after one step.
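The identity $[F]_{\sigma_n}^{\sigma_m}=[f]_{\beta}^{\gamma}$ can be checked numerically as well. A sketch with made-up data: $V=\Bbb{R}^3$, $W=\Bbb{R}^2$, $f$ given in standard coordinates by a matrix `A`, and bases $\beta,\gamma$ stored as the columns of `B` and `C`. We build $F=g_\gamma\circ f\circ g_\beta^{-1}$ by actual function composition and compare its standard-basis matrix with $[f]_\beta^\gamma$.

```python
import numpy as np

# Hypothetical data: f in standard coordinates, and bases beta, gamma.
A = np.array([[1., 2., 0.],
              [0., 1., 3.]])   # f: R^3 -> R^2 in standard coordinates
B = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])   # basis beta for V (columns)
C = np.array([[2., 1.],
              [1., 1.]])       # basis gamma for W (columns)

def g_beta_inv(x):             # beta-coordinates -> vector in V
    return B @ x

def f(v):                      # the linear map f itself
    return A @ v

def g_gamma(w):                # vector in W -> gamma-coordinates
    return np.linalg.solve(C, w)

# [F] in standard bases: column j = F(e_j), computed by composing the maps.
F_std = np.column_stack([g_gamma(f(g_beta_inv(e))) for e in np.eye(3)])

# [f]_beta^gamma: column j = gamma-coordinates of f(v_j), i.e. C^{-1} A B.
f_bg = np.linalg.solve(C, A @ B)

print(np.allclose(F_std, f_bg))  # True
```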