I'm stuck on this exercise and any help would be much appreciated:
Let $R$ be a commutative ring with ideals $I,J$. Show that $R=I+J$ if and only if the map $\phi: R \to R/I \times R/J$ given by $\phi(x)= (x + I, x + J)$ is surjective.
Assuming surjectivity, I got as far as realizing that I need to show $R\subseteq I+J$, since $I+J\subseteq R$ always holds ($I$ and $J$ are subsets of $R$, and $R$ is closed under addition). Now I'm not sure how to proceed.
Also, I don't even know where to begin in the direction $R=I+J$ implies $\phi$ is onto.
Again, any help, pointers or references are much appreciated.
Thanks in advance.
If $R=I+J$, then for any $a_1,a_2\in R$ we can write $a_i=x_i+y_i$ with $x_i\in I$ and $y_i\in J$ ($i=1,2$). Then $y_1+x_2\equiv y_1\equiv a_1\pmod I$ and $y_1+x_2\equiv x_2\equiv a_2\pmod J$, so $(a_1\bmod I,\,a_2\bmod J)=\phi(y_1+x_2)$.
Conversely, if $\phi$ is surjective and $a\in R$, choose $x$ with $\phi(x)=(a\bmod I,\,0\bmod J)$. Then $a-x\in I$ and $x\in J$, so $a=(a-x)+x\in I+J$.
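As a sanity check, here is the recipe from the forward direction worked out in the familiar case $R=\mathbb{Z}$, $I=2\mathbb{Z}$, $J=3\mathbb{Z}$ (this is just an instance of the Chinese Remainder Theorem; note $R=I+J$ since $1=-2+3$). Say we want a preimage of $(1\bmod 2,\ 2\bmod 3)$:

```latex
% Decompose each target representative as (element of I) + (element of J):
\[
  a_1 = 1 = \underbrace{-2}_{x_1\in I} + \underbrace{3}_{y_1\in J},
  \qquad
  a_2 = 2 = \underbrace{2}_{x_2\in I} + \underbrace{0}_{y_2\in J}.
\]
% The recipe says phi(y_1 + x_2) hits the target:
\[
  \phi(y_1+x_2) = \phi(3+2) = \phi(5)
  = (5 \bmod 2,\; 5 \bmod 3)
  = (1 \bmod 2,\; 2 \bmod 3).
\]
```

Indeed $5\equiv 1\pmod 2$ and $5\equiv 2\pmod 3$, as required.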