The following is exercise number 10 in Hungerford's Algebra, page 190:
Let $R$ be a ring with no zero divisors such that for all $r, s \in R$ there exist $a, b \in R$, not both zero, such that $ar + bs = 0$.
a) If $R = K \oplus L$ (module direct sum), then $K = 0$ or $L = 0$.
b) If $R$ has an identity, then $R$ has the invariant dimension property.
I have proved part a), but I'm having difficulty with b). I realize that it suffices to show that $R^m \cong R^n$ implies $m = n$, but I can't seem to use part a) to conclude this.
I think you might have been bamboozled into thinking that you should use a) to prove b), when really you can do it independently. Let's first note that the original hypothesis could have been rephrased as: "if $r,s$ are nonzero, then there exist nonzero $a,b$ such that $ar=bs$." (If, say, $a=0$, then $bs=0$ with $b\neq 0$ would force $s=0$, since $R$ has no zero divisors; so when $r,s$ are nonzero, both $a$ and $b$ must be nonzero, and $ar=-bs$ becomes $ar=bs$ after replacing $b$ with $-b$.)
Let the ring $R$ be as above and have an identity. Suppose $R^m\cong R^n$ with $m\neq n$; without loss of generality, $m>n$. Choosing bases, express the isomorphism and its inverse as an $n\times m$ matrix $A$ and an $m\times n$ matrix $B$ over $R$, so that $AB=I_n$ and $BA=I_m$.
Now, the main thing that strikes me about the $ar=bs$ hypothesis is that it lets us perform elementary row operations on matrices over $R$, clearing entries until we reach row-echelon form: if a column contains a nonzero pivot entry $r$ and a nonzero entry $s$ in a lower row, choose nonzero $a,b$ with $as=br$; then replacing the lower row by $a\cdot(\text{lower row})-b\cdot(\text{pivot row})$ kills that entry. We proceed to use this ability to arrive at a contradiction.
The row-combining matrices look like modified identity matrices: one diagonal entry is replaced by a nonzero element $a$ of $R$, a second nonzero element $b$ of $R$ sits elsewhere in that same row, and every other entry agrees with the identity matrix.
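To fix ideas, here is a sketch of such a matrix in the $3\times 3$ case, performing the row operation $R_2 \mapsto aR_2 + bR_3$ with $a,b$ nonzero:

$$E=\begin{pmatrix}1&0&0\\0&a&b\\0&0&1\end{pmatrix},\qquad E\begin{pmatrix}R_1\\R_2\\R_3\end{pmatrix}=\begin{pmatrix}R_1\\aR_2+bR_3\\R_3\end{pmatrix}.$$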
We claim that such an elementary matrix has zero left kernel, i.e. no nonzero row vector $w$ satisfies $wE=0$. To see why, let $E$ be the matrix replacing row $i$ by $a\cdot(\text{row } i)+b\cdot(\text{row } j)$, and suppose $wE=0$. Every column of $E$ other than columns $i$ and $j$ is a standard basis column, so $w_k=0$ for $k\neq i,j$; column $i$ gives $w_ia=0$, so $w_i=0$ since $a\neq 0$ and $R$ has no zero divisors; and then column $j$ gives $w_j+w_ib=w_j=0$. So the row operation has zero left kernel.
We will also need row-swapping matrices, but these are just permutation matrices with entries in $\{0,1\}$, and they obviously have zero left kernel as well.
So, it is possible to find a matrix $X$, a product of such elementary matrices, with $XB$ in row-echelon form; being a product of matrices with zero left kernel, $X$ has zero left kernel too (peel the factors off one at a time). Since $B$ has more rows than columns, the echelon form $XB$ has at least $m-n$ rows of zeros at the bottom. But look: $(XB)A=X(BA)=X$. On one hand, the zero rows sitting at the bottom of $XB$ produce zero rows at the bottom of $X$ in this multiplication. On the other hand, if row $i$ of $X$ were zero, then $e_iX=0$ for the nonzero standard row vector $e_i$, contradicting the fact that $X$ has zero left kernel. This contradiction shows $m=n$.
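The elimination can be sketched concretely over $\mathbb{Z}$, which is a commutative (hence left Ore) domain. Here `ore_pair` is a hypothetical stand-in for the Ore condition (over a commutative domain the pair $(a,b)=(s,r)$ works, since $sr=rs$), and all the names are my own, not Hungerford's:

```python
def ore_pair(r, s):
    # Stand-in for the Ore condition: return nonzero (a, b) with a*r == b*s.
    # Over a commutative domain such as Z we may simply take (a, b) = (s, r).
    return s, r

def row_reduce(B):
    """Bring B to row-echelon form E using only Ore-style row operations
    (row_i <- a*row_i - b*row_piv with a, b nonzero) and row swaps,
    accumulating the product X of the elementary matrices, so that X*B == E."""
    m, n = len(B), len(B[0])
    E = [row[:] for row in B]
    X = [[int(i == j) for j in range(m)] for i in range(m)]  # identity matrix
    piv = 0
    for col in range(n):
        # find a row at or below `piv` with a nonzero entry in this column
        pr = next((i for i in range(piv, m) if E[i][col] != 0), None)
        if pr is None:
            continue
        E[piv], E[pr] = E[pr], E[piv]   # row swap: a permutation matrix
        X[piv], X[pr] = X[pr], X[piv]
        for i in range(piv + 1, m):
            if E[i][col] != 0:
                # nonzero a, b with a*E[i][col] == b*E[piv][col]; then
                # row_i <- a*row_i - b*row_piv clears the entry
                a, b = ore_pair(E[i][col], E[piv][col])
                E[i] = [a * x - b * y for x, y in zip(E[i], E[piv])]
                X[i] = [a * x - b * y for x, y in zip(X[i], X[piv])]
        piv += 1
    return E, X

# A 3x2 integer matrix: the echelon form must end in a zero row,
# while X, a product of left-kernel-free elementary matrices, has none.
B = [[2, 3], [4, 1], [6, 5]]
E, X = row_reduce(B)
```

With this $B$, the last row of $E$ is zero while every row of $X$ is nonzero; in the proof, the identity $(XB)A=X$ then transports the zero row of $XB$ into $X$, which is the contradiction.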
(Edit note: this solution is completely different from the first one I offered, which didn't work as anticipated.)
(More superfluous information for those who happen to be interested.)
A high-level way of phrasing this problem (which is, no doubt, not Hungerford's goal at this point in his book) is that the stated condition is the left Ore condition on a domain (with identity). It turns out that this makes the domain a left uniform ring, meaning any two nonzero left ideals intersect nontrivially, and hence $R^n$ has left uniform dimension $n$. At this point, the original problem amounts to checking that finite left uniform dimension is well-defined.
Now, a) is actually pretty mundane: all domains with identity have that property. (Write $1=e+f$ with $e\in K$ and $f\in L$; then $e=e\cdot 1=e^2+ef$ with $e^2\in K$ and $ef\in L$, so uniqueness of the decomposition gives $ef=0$, and a domain then forces $e=0$ or $f=0$, whence $K=0$ or $L=0$.) But left Ore domains have a much nicer property that kind of looks like that: if $k$ and $l$ are nonzero elements of left ideals $K$ and $L$, the Ore condition supplies nonzero $a,b$ with $ak=bl$, and this common value is a nonzero element of $K\cap L$. From this, via uniform dimension, one gets the following: if there is an injective homomorphism of left $R$-modules $R^m\to R^n$, then $m\le n$.

This result obviously implies the invariant dimension property. Actually, this last result holds for any ring $R$ with finite uniform left dimension.
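Assuming the standard facts that uniform dimension is additive over finite direct sums, monotone under injections, and an isomorphism invariant, the chain is short: an embedding $R^m\hookrightarrow R^n$ gives

$$m\cdot\operatorname{u.dim}(R)=\operatorname{u.dim}(R^m)\le\operatorname{u.dim}(R^n)=n\cdot\operatorname{u.dim}(R),$$

and cancelling the finite nonzero $\operatorname{u.dim}(R)$ yields $m\le n$; applying this to an isomorphism in both directions gives $m=n$.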