The Problem: Suppose $R$ is an integral domain. Suppose $\{x_1,\dots, x_m\}$ is a set of linearly independent elements in an $R$-module $A$, and $\{y_1,\dots, y_n\}$ is a set of linearly independent elements in an $R$-module $B$. Let $M$ be the module generated by $\{(x_1, 0),\dots, (x_m, 0), (0, y_1),\dots, (0, y_n)\}$. Show that any linearly independent set in $M$ has at most $m+n$ elements.
My Attempt: Suppose, for the sake of contradiction, that $\{(u_1, v_1),\dots, (u_{m+n+1}, v_{m+n+1})\}$ is linearly independent in $M$. Since each $u_k$ lies in the submodule generated by $\{x_1,\dots,x_m\}$, the set $\{u_1,\dots, u_{m+n+1}\}$ has at most $m$ linearly independent elements; WLOG, say a maximal independent subset is contained in $\{u_1,\dots, u_m\}$. Similarly, the set $\{v_{m+1},\dots, v_{m+n+1}\}$ has at most $n$ linearly independent elements; WLOG, say a maximal independent subset is contained in $\{v_{m+1},\dots, v_{m+n}\}$. I wish to show that there exist $a_1,\dots, a_{m+n+1}$, not all zero, such that $a_1(u_1, v_1)+\dots+a_{m+n+1}(u_{m+n+1}, v_{m+n+1})=0$; but here I am stuck. Any hint would be greatly appreciated.
Without loss of generality, we may replace $A$ by the submodule generated by $x_1, \ldots, x_m$ and $B$ by the submodule generated by $y_1, \ldots, y_n$, since $M$ is contained in the product of these submodules. (Note there can be no repeats in the lists $x$ and $y$: a repeated element would give a nontrivial relation, contradicting linear independence.) Then $A$ is a free module of rank $m$, $B$ is a free module of rank $n$, and $M = A \times B$ is a free module of rank $m + n$.
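The freeness claim can be spelled out; here is a short sketch (the map $\varphi$ is my notation, not from the problem statement):

```latex
% Sketch: a module generated by a linearly independent list is free.
Define $\varphi : R^m \to A$ by $\varphi(e_i) = x_i$ on the standard basis.
Then $\varphi$ is surjective because the $x_i$ generate $A$, and injective
because a nonzero kernel element $(a_1,\dots,a_m)$ would give a nontrivial
relation $\sum_i a_i x_i = 0$, contradicting linear independence. Hence
\[
A \cong R^m, \qquad B \cong R^n, \qquad
M = A \times B \cong R^m \oplus R^n \cong R^{m+n}.
\]
```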
In general, consider a free module $M$ of rank $j$, and suppose we have a linearly independent list $z_1, \ldots, z_j, z_{j + 1} \in M$. We can apply the extension-of-scalars functor along the inclusion $R \subseteq \operatorname{Frac}(R)$, the field of fractions of $R$, to obtain a vector space $M' = M \otimes_R \operatorname{Frac}(R)$ of dimension $j$ together with a list of $j + 1$ linearly independent vectors $z_1 \otimes 1, \ldots, z_{j+1} \otimes 1$ (independence is preserved because any dependence relation over $\operatorname{Frac}(R)$ can be cleared of denominators to yield one over $R$). This, we know from linear algebra, is a contradiction.
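The denominator-clearing step is the heart of the argument, so here is a sketch of it (writing $K = \operatorname{Frac}(R)$):

```latex
% Sketch: linear independence over R persists over K = Frac(R).
Suppose $\sum_{i=1}^{j+1} c_i \,(z_i \otimes 1) = 0$ in $M' = M \otimes_R K$,
with $c_i = a_i / b_i \in K$ not all zero ($a_i, b_i \in R$, $b_i \neq 0$).
Multiplying through by $b = b_1 \cdots b_{j+1} \neq 0$ gives
\[
\sum_{i=1}^{j+1} a_i b_i' \,(z_i \otimes 1) = 0,
\qquad b_i' = \prod_{k \neq i} b_k \in R .
\]
Since $M$ is free, the natural map $M \to M \otimes_R K$, $z \mapsto z \otimes 1$,
is injective, so $\sum_i (a_i b_i')\, z_i = 0$ already holds in $M$. Because $R$
is an integral domain, the coefficients $a_i b_i'$ are not all zero, which
contradicts the linear independence of $z_1, \ldots, z_{j+1}$.
```

Note that both hypotheses are used here: freeness of $M$ makes $M \to M \otimes_R K$ injective, and the domain hypothesis guarantees the cleared coefficients are not all zero.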