About Jacobson's proof of the structure theorem for f.g. modules over a PID


I'm reading Jacobson's Basic Algebra I, and am at the part where he first proves the structure theorem for f.g. modules over a PID. There are a couple of details I can't get past, which I suspect come down to a typo or perhaps some sloppy writing, so I'd like to clarify them.

Statement and first portion of the proof

It seems this theorem has a couple of different formulations, so here's the one we are working with in particular: a non-zero f.g. module $M$ over a PID $D$ is a direct sum of cyclic modules $$ M = Dz_1 \oplus Dz_2 \oplus \cdots \oplus Dz_s $$ such that $\operatorname{ann}(z_i) \supset \operatorname{ann}(z_{i + 1})$ and $\operatorname{ann}(z_i) \neq D$.

The proof proceeds by fixing generators $x_1, \ldots, x_n$ of $M$ and a basis $(e_1, \ldots, e_n)$ for the free module $D^n$. Some previous work in the book shows that, via the epimorphism $\eta: e_i \mapsto x_i$, we have $M \cong D^n/K$, where $K = \ker{\eta}$ is itself f.g., and so has generators $f_1, \ldots, f_m$, say. We of course have a representation $$ f_j = \sum_{i = 1}^{n} a_{ji} e_i $$ for each of these generators in terms of the $e_i$, and we define the $m \times n$ matrix $A = (a_{ji})$. (Jacobson calls this matrix the relations matrix of the $f_j$ with respect to the $e_i$.)

We can now change to a new basis $e_1', \ldots, e_n'$ of $D^n$ and new generators $f_1', \ldots, f_m'$ of $K$, where $$ e_i' = \sum_{j = 1}^{n} p_{ij} e_j, \qquad f_k' = \sum_{l = 1}^{m} q_{kl} f_l $$ and the $n \times n$ matrix $P = (p_{ij})$ and $m \times m$ matrix $Q = (q_{kl})$ are both invertible in their respective matrix rings. By some more previous work in the book, we can arrange that $QAP^{-1}$, which is the relations matrix of the $f_k'$ with respect to the $e_i'$, is in Smith normal form with invariant factors $d_1, \ldots, d_r$. Concretely, these relations are $f_i' = d_i e_i'$ for $1 \leq i \leq r$, and $f_i' = 0$ for $r < i \leq m$.

We now consider the $n$ images $$ y_i = \eta(e_i') = \sum_{j = 1}^{n} p_{ij} x_j, $$ which in fact generate $M$. Also, for $1 \leq i \leq r$ we have $$ d_i y_i = d_i \eta(e_i') = \eta(d_i e_i') = \eta(f_i') = 0. $$ Now, suppose we had some relation $$ \sum_{i = 1}^{n} b_i y_i = \eta\left(\sum_{i = 1}^{n} b_i e_i'\right) = 0, \qquad b_i \in D. $$ Then immediately $\sum_{i = 1}^{n} b_i e_i' \in K$.
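For concreteness, here is a small sketch of the diagonalization step over the PID $D = \mathbb{Z}$. The function and the example matrix are my own illustration, not from the book; it reduces a relations matrix to Smith normal form by elementary row and column operations, exactly the kind of reduction the proof invokes.

```python
from math import gcd

def smith_normal_form(A):
    """Diagonalize an integer matrix by elementary row/column operations
    (a toy sketch over the PID Z, not optimized). The nonzero diagonal
    entries d_1 | d_2 | ... of the result are the invariant factors."""
    A = [row[:] for row in A]
    m, n = len(A), len(A[0])
    for t in range(min(m, n)):
        # move some nonzero entry of the trailing submatrix to position (t, t)
        piv = next(((i, j) for i in range(t, m) for j in range(t, n)
                    if A[i][j] != 0), None)
        if piv is None:
            break  # the remaining submatrix is zero
        i, j = piv
        A[t], A[i] = A[i], A[t]
        for row in A:
            row[t], row[j] = row[j], row[t]
        # Euclidean elimination below and to the right of the pivot
        while (any(A[i][t] for i in range(t + 1, m))
               or any(A[t][j] for j in range(t + 1, n))):
            for i in range(t + 1, m):
                if A[i][t]:
                    q = A[i][t] // A[t][t]
                    for k in range(n):
                        A[i][k] -= q * A[t][k]
                    if A[i][t]:          # nonzero remainder: smaller pivot
                        A[t], A[i] = A[i], A[t]
            for j in range(t + 1, n):
                if A[t][j]:
                    q = A[t][j] // A[t][t]
                    for k in range(m):
                        A[k][j] -= q * A[k][t]
                    if A[t][j]:
                        for k in range(m):
                            A[k][t], A[k][j] = A[k][j], A[k][t]
    # normalize signs and enforce the divisibility chain d_1 | d_2 | ...
    r = sum(1 for t in range(min(m, n)) if A[t][t] != 0)
    for t in range(r):
        A[t][t] = abs(A[t][t])
    for t in range(r):
        for s in range(t + 1, r):
            g = gcd(A[t][t], A[s][s])
            A[t][t], A[s][s] = g, A[t][t] * A[s][s] // g
    return A

# Example: the relations matrix [[2, 4], [6, 8]] has invariant factors 2, 4,
# so if K is generated by its rows then Z^2/K = Z/2Z (+) Z/4Z.
print(smith_normal_form([[2, 4], [6, 8]]))  # [[2, 0], [0, 4]]
```

The gcd/lcm pass at the end is the standard trick for repairing the divisibility chain once the matrix is diagonal.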

My question

Jacobson says that we hence have $$ \sum b_i e_i' = \sum c_i f_i' = \sum c_i d_i e_i', $$ and since the $e_i'$ are a basis, this implies $b_i = c_i d_i$ for $1 \leq i \leq n$. This doesn't make any sense to me. The lack of indices in the summations (which is exactly how they appear in the text) is a bit troubling, but what troubles me more is that we started with $r$ invariant factors, yet these statements somehow refer to $n$ of them. What gives? The only way I can see that chain of equalities being true, once the missing summation indices are filled in, is if Jacobson means to say $$ \sum_{i = 1}^{n} b_i e_i' = \sum_{i = 1}^{m} c_i f_i' = \sum_{i = 1}^{r} c_i d_i e_i', $$ because we only have $m$ generators of $K$, and all but the first $r$ of them are zero. Comparing coefficients of the $e_i'$, we now get $b_i = c_i d_i$ for $1 \leq i \leq r$, and $b_i = 0$ for $r < i \leq n$. This conclusion is different from the one stated, though.
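To sanity-check this reading, here is a toy instance of my own (not from the book) with $D = \mathbb{Z}$, $n = 2$, $m = r = 1$, and $d_1 = 2$, so $K$ is generated by $f_1' = 2e_1'$ and $M \cong \mathbb{Z}/2\mathbb{Z} \oplus \mathbb{Z}$. Enumerating small coefficient vectors shows which combinations $b_1 e_1' + b_2 e_2'$ land in $K$:

```python
# Toy instance (not from the book): D = Z, n = 2, m = r = 1, d_1 = 2,
# with K = {c1 * (d1, 0) : c1 in Z} inside Z^2.
d1 = 2

def in_K(b1, b2):
    """Membership test for K: b1*e1' + b2*e2' lies in K iff d1 | b1 and b2 = 0."""
    return b2 == 0 and b1 % d1 == 0

relations = [(b1, b2)
             for b1 in range(-4, 5)
             for b2 in range(-4, 5)
             if in_K(b1, b2)]
print(relations)  # [(-4, 0), (-2, 0), (0, 0), (2, 0), (4, 0)]
# Exactly the pairs with b_1 = c_1 d_1 and b_2 = 0, matching the reading
# b_i = c_i d_i for i <= r and b_i = 0 for r < i <= n.
```

In particular no relation with $b_2 \neq 0$ occurs, consistent with $b_i = 0$ for $r < i \leq n$ rather than $b_i = c_i d_i$ for all $i \leq n$.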