Linear independence of the basis of $F/IF$


I'm working through the exercises in Rotman's *An Introduction to Homological Algebra*. Exercise 2.12 asks one to prove that if $I$ is a maximal ideal in a commutative ring $R$ and $X$ is a basis of a free $R$-module $F$, then $F/IF$ is a vector space over $R/I$ with basis $\{ x + IF \mid x \in X \}$.

I had no trouble except in proving that this basis is linearly independent. For that we take $x_1, \dots, x_n \in X$ and $r_1, \dots, r_n \in R$ with $$ \sum_{i = 1}^{n}{(r_i + I)(x_i + IF)} = \Bigl(\sum_{i = 1}^{n}{r_ix_i}\Bigr) + IF = 0 $$ and $x_i - x_j \notin IF$ for $i \neq j$. Thus $\sum_{i = 1}^{n}{r_ix_i} \in IF$. Now I argue that every element of $IF$ can be written as $\sum_{j=1}^{m}{s_jx_j'}$ with $s_j \in I$ and $x_j' \in X$: any element of $IF$ can be written as $\sum_{j=1}^{m}{s_jy_j}$ with $s_j \in I$ and $y_j \in F$, and since $X$ is a basis of $F$, every $y_j$ can be written as a linear combination of elements of $X$. Substituting these combinations and using the ideal properties of $I$ yields the desired form.
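As a concrete sanity check (my own toy example, not part of Rotman's exercise), take $R = \mathbb{Z}$, $I = 5\mathbb{Z}$, and $F = \mathbb{Z}^2$ with the standard basis $e_1, e_2$. The claim above says every element of $IF$ has the form $s_1e_1 + s_2e_2$ with $s_1, s_2 \in I$; a brute-force sample over finitely many two-term sums confirms this:

```python
import itertools

p = 5  # I = pZ is a maximal ideal of R = Z since p is prime

# Sample elements of IF as two-term sums s*y + t*z with s, t in I and
# y, z in F = Z^2, then check that both coordinates lie in I = pZ,
# i.e. that the element equals s1*e1 + s2*e2 with s1, s2 in I.
ideal_slice = [k * p for k in range(-2, 3)]                     # finitely many elements of I
module_slice = list(itertools.product(range(-2, 3), repeat=2))  # finitely many elements of F

for s, t in itertools.product(ideal_slice, repeat=2):
    for y, z in itertools.product(module_slice, repeat=2):
        w = (s * y[0] + t * z[0], s * y[1] + t * z[1])
        # the coordinates of w are its coefficients in the basis e1, e2
        assert w[0] % p == 0 and w[1] % p == 0
print("every sampled element of IF has its basis coefficients in I")
```

Of course this only samples finitely many elements; the general argument is the substitution above.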

Back to the proof: I can write $\sum_{i = 1}^{n}{r_ix_i} = \sum_{j=1}^{m}{s_jx_j'}$ with $s_j \in I$ and $x_j' \in X$. After reordering terms and setting $r_i$ or $s_i$ to $0$ where undefined, we may assume without loss of generality that $\sum_{i = 1}^{n}{r_ix_i} = \sum_{i=1}^{n}{s_ix_i}$. But $X$ is a basis, so $r_i = s_i \in I$ for all $i$.

My problem now is that I never used that the $x_i$ come from different cosets. If $x_1$ and $x_2$ came from the same coset, I would have $1 \cdot x_1 - 1 \cdot x_2 \in IF$ even though $1 \notin I$, yet I do not see exactly where my argument above breaks down.


2 Answers

BEST ANSWER

It is not possible that $x_1-x_2\in IF$: by your discussion of how every element of $IF$ is written, you would get $1\in I$, a contradiction. So you are not using that the $x_i$'s come from different cosets; rather, the distinctness of the cosets is a consequence of the linear independence you proved.

ANSWER

An element of $IF$ can be written as $$ y=\sum_{k=1}^n a_ky_k $$ with $a_k\in I$ and $y_k\in F$. By definition of a basis, $$ y_k=\sum_{x\in X}r_{x,k}x $$ (only finitely many $r_{x,k}$ are nonzero), so we have $$ y=\sum_{x\in X}\Bigl(\,\sum_{k=1}^n a_kr_{x,k}\Bigr)x $$ Hence we can write $$ y=\sum_{x\in X}s_{x}x \tag{*} $$ with $s_x\in I$. The converse clearly holds as well. Note that the coefficients $s_x$ are unique, since $X$ is a basis.

Now, if $$ \sum_{x\in X}(r_x+I)(x+IF)=0 $$ we get $$ \sum_{x\in X}r_xx\in IF $$ and therefore, by the uniqueness of the coefficients in (*), $r_x\in I$ for every $x\in X$.
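To see this conclusion in a toy case (my own illustration), take $R = \mathbb{Z}$, $I = 5\mathbb{Z}$, $F = \mathbb{Z}^2$, and the non-standard basis $x_1 = (1,1)$, $x_2 = (0,1)$ (a basis since the determinant of the change-of-basis matrix is $1$). A brute-force search over all residues shows that the only relation $r_1x_1 + r_2x_2 \in IF$ with $0 \le r_i < 5$ is the trivial one, i.e. the cosets $x_1 + IF$, $x_2 + IF$ are linearly independent over $\mathbb{Z}/5\mathbb{Z}$:

```python
p = 5
x1, x2 = (1, 1), (0, 1)  # a basis of F = Z^2 (determinant 1)

# Collect all (r1, r2) mod p with r1*x1 + r2*x2 in IF = pZ x pZ.
relations = []
for r1 in range(p):
    for r2 in range(p):
        v = (r1 * x1[0] + r2 * x2[0], r1 * x1[1] + r2 * x2[1])
        if v[0] % p == 0 and v[1] % p == 0:  # v lies in IF
            relations.append((r1, r2))
print(relations)  # → [(0, 0)]: only the trivial relation
```

Since $v = (r_1, r_1 + r_2)$, membership in $IF$ forces $r_1 \equiv 0$ and then $r_2 \equiv 0 \pmod 5$, matching the output.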

This proves linear independence (I'm afraid your proof is wrong from the beginning).

The problem you seem to have with $x_i-x_j\notin IF$ is really a non-issue. In any case, there are no distinct elements $x,y\in X$ with $x+IF=y+IF$: this would imply $x-y\in IF$, so, by the characterization (*), $1\in I$, a contradiction.