Show that the map $f : S^1 \times S^1 \longrightarrow S^1 \times S^1$ defined by $(z,w) \mapsto (z^{am} w^{bn}, z^{cm} w^{dn})$ is a covering map of degree $mn$ where $ad - bc = 1$ and $a,b,c,d,m,n \in \mathbb Z.$
Our instructor says that it can be written as a composition of covering maps of finite degree and hence it's a covering map. But I can't see how can it be written as composition of covering maps.
Any help in this regard would be warmly appreciated. Thanks for investing your valuable time on my question.
Define maps $\phi, \psi : S^1 \times S^1 \to S^1 \times S^1$ by $$\phi(z,w) = (z^m,w^n), $$ $$\psi(u,v) = (u^av^b,u^cv^d) .$$ Then $\psi(\phi(z,w)) = (z^{am}w^{bn}, z^{cm}w^{dn})$, so $f = \psi \circ \phi$.
The maps $\omega_k : S^1 \to S^1, \omega_k(z)= z^k$, are covering maps of degree $k$. We have $\phi = \omega_m \times \omega_n$. We now invoke the following
Theorem. Let $p : \tilde X \to X$ and $q : \tilde Y \to Y$ be covering maps. Then $p \times q$ is a covering map. If $\deg p = n$ and $\deg q = m$, then $\deg(p \times q) = nm$.
This is well-known and the proof is very easy.
We conclude that $\phi$ is a covering map of degree $mn$.
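As a quick numerical sanity check (not part of the proof), one can enumerate the fiber of $\phi$ over $(1,1)$ for small $m, n$: it consists of all pairs of an $m$-th and an $n$-th root of unity, hence has exactly $mn$ elements. A minimal sketch, with $m = 2$, $n = 3$ chosen for illustration:

```python
import cmath

def fiber_over_one(m, n):
    """Preimages of (1, 1) under phi(z, w) = (z^m, w^n) on S^1 x S^1:
    all pairs (z, w) with z an m-th and w an n-th root of unity."""
    zs = [cmath.exp(2j * cmath.pi * k / m) for k in range(m)]
    ws = [cmath.exp(2j * cmath.pi * l / n) for l in range(n)]
    return [(z, w) for z in zs for w in ws]

m, n = 2, 3
fiber = fiber_over_one(m, n)
# every listed point really maps to (1, 1), up to rounding
assert all(abs(z**m - 1) < 1e-9 and abs(w**n - 1) < 1e-9 for z, w in fiber)
print(len(fiber))  # m * n = 6 points in the fiber
```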
In order that the claim in your question be true, $\psi$ should be a covering map of degree $1$. But such maps are homeomorphisms. This heuristic consideration shows that we should prove that $\psi$ is a homeomorphism.
Given a $(2 \times 2)$-matrix $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$ with entries in $\mathbb Z$, let us define $$A^* : S^1 \times S^1 \to S^1 \times S^1, A^*(u,v) = (u^av^b,u^cv^d) .$$
Clearly the unit matrix $E$ gives $$ E^* = id \tag{1} .$$ Let us next show that for $B = \begin{pmatrix} e & f \\ g & h \end{pmatrix}$ one has $$(A \cdot B)^* = A^* \circ B^* \tag{2} .$$
We have $A \cdot B = \begin{pmatrix} a e + b g & a f + b h \\ c e + d g & c f + d h \end{pmatrix}$, thus $$(A \cdot B)^*(u,v) = (u^{a e + b g} v^{a f + b h}, u^{c e + d g }v^{c f + d h}) ,$$
$$A^*(B^*(u,v)) = A^*(u^ev^f,u^gv^h) = ((u^ev^f)^{a}(u^gv^h)^{b}, (u^ev^f)^{c}(u^gv^h)^{d}) \\= (u^{e a} v^{f a} u^{g b} v^{h b},u^{e c} v^{f c} u^{g d} v^{h d}) = (u^{e a + g b}v^{f a + h b},u^{e c + g d}v^{f c + h d}) .$$
This proves $(2)$.
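Identity $(2)$ can also be checked numerically on sample points. The following sketch uses arbitrarily chosen integer matrices and an arbitrary point of $S^1 \times S^1$:

```python
import cmath

def star(M):
    """The induced map M^*(u, v) = (u^a v^b, u^c v^d) on S^1 x S^1,
    for an integer 2x2 matrix M = [[a, b], [c, d]]."""
    (a, b), (c, d) = M
    return lambda u, v: (u**a * v**b, u**c * v**d)

def matmul(A, B):
    """Product of two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1], [1, 1]]   # sample integer matrices, chosen for illustration
B = [[1, 3], [0, 1]]

u = cmath.exp(0.7j)    # sample point on S^1 x S^1
v = cmath.exp(1.9j)

lhs = star(matmul(A, B))(u, v)     # (A . B)^*
rhs = star(A)(*star(B)(u, v))      # A^* composed with B^*
assert all(abs(x - y) < 1e-9 for x, y in zip(lhs, rhs))
```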
Formulae $(1)$ and $(2)$ show that if $A$ is invertible and $A^{-1}$ has all entries in $\mathbb Z$, then $A^*$ is a homeomorphism.
$\det A = ad - bc \ne 0$ means that $A$ is invertible as a real matrix. But we even have $\det A = 1$, which implies that $A^{-1}$ has all entries in $\mathbb Z$.
Since $\psi = A^*$, we are finished.
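Concretely, for $\det A = 1$ one has $A^{-1} = \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$ with integer entries, and $(1)$ and $(2)$ give $(A^{-1})^* \circ A^* = E^* = id$. A small numerical sketch (sample matrix and point chosen for illustration):

```python
import cmath

def star(M):
    """The induced map M^*(u, v) = (u^a v^b, u^c v^d) on S^1 x S^1."""
    (a, b), (c, d) = M
    return lambda u, v: (u**a * v**b, u**c * v**d)

A = [[2, 1], [1, 1]]        # ad - bc = 1
A_inv = [[1, -1], [-1, 2]]  # [[d, -b], [-c, a]], again with integer entries

u, v = cmath.exp(0.4j), cmath.exp(2.3j)
w = star(A_inv)(*star(A)(u, v))  # (A^{-1})^* after A^* should be the identity
assert abs(w[0] - u) < 1e-9 and abs(w[1] - v) < 1e-9
```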
Update:
That $A^{-1}$ has integer entries follows from
Lemma. Let $A$ be an $(n \times n)$-matrix with entries in $\mathbb Z$. Then $A$ has an inverse $A^{-1}$ with entries in $\mathbb Z$ if and only if $\det A = \pm 1$.
Proof. For all matrices $A, B$ with entries in $\mathbb R$ we have $\det(A \cdot B) = \det A \cdot \det B$. If both $A, B$ have entries in $\mathbb Z$, then also $\det A, \det B \in \mathbb Z$.
Assume that $A$ has entries in $\mathbb Z$ and has an inverse $A^{-1}$ with entries in $\mathbb Z$. Then $$1 = \det E = \det(A \cdot A^{-1}) = \det A \cdot \det A^{-1} \tag{3}.$$ Since both $\det A, \det A^{-1} \in \mathbb Z$, $(3)$ can only be satisfied when $\det A = \det A^{-1} = 1$ or $\det A = \det A^{-1} = -1$.
Assume that $A$ has entries in $\mathbb Z$ and $\det A = \pm 1$. Then $A$ is invertible as a real matrix. Its inverse is given by $$A^{-1} = \frac{1}{\det A} \left((-1)^{i+j} A_{ij}\right)^T \tag{4}$$ where the $A_{ij}$ are the minors of $A$ and $^T$ denotes transposition. The minors are the determinants of the $((n-1) \times (n-1))$-submatrices of $A$ obtained by deleting the $i$-th row and $j$-th column. If $A$ has entries in $\mathbb Z$, then all $A_{ij} \in \mathbb Z$, and therefore $A^{-1}$ has entries in $\mathbb Z$ since $\det A = \pm 1$.
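Formula $(4)$ can be checked mechanically for small matrices. A sketch for the $2 \times 2$ case, where the cofactor formula reduces to swapping the diagonal entries and negating the off-diagonal ones:

```python
def inverse_2x2_integer(A):
    """Integer inverse of an integer 2x2 matrix with det A = +/-1,
    via the adjugate formula (4)."""
    (a, b), (c, d) = A
    det = a * d - b * c
    assert det in (1, -1), "only matrices with det = +/-1 have integer inverses"
    # division by +/-1 is exact, so // keeps everything in the integers
    return [[d // det, -b // det], [-c // det, a // det]]

A = [[3, 2], [4, 3]]  # det = 1
A_inv = inverse_2x2_integer(A)
# A * A_inv is the identity matrix
prod = [[sum(A[i][k] * A_inv[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
assert prod == [[1, 0], [0, 1]]
```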