Let $M$ be a finitely generated $R$-module, $I\subseteq R$ an ideal and $\phi\colon M\rightarrow M$ an $R$-linear map such that $\phi(M) \subseteq IM$. Then $\phi$ satisfies an equation of the form $$\phi^n+a_1\phi^{n-1}+\ldots+a_n=0$$ where $a_i\in I$ for $1\leq i\leq n$.
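To see the statement in action, here is a quick numeric sanity check (not part of the proof) in Python with sympy, taking $R=\mathbb{Z}$, $I=(2)$, $M=\mathbb{Z}^2$, and $\phi$ given by a matrix whose entries all lie in $I$; the matrix chosen is just an illustrative example:

```python
# Numeric check of the proposition for R = Z, I = (2), M = Z^2,
# with phi given by a matrix whose entries lie in I.
from sympy import Matrix, eye

phi = Matrix([[2, 4], [6, 2]])        # all entries lie in I = (2)

# Characteristic polynomial of phi: lam^2 + a1*lam + a2,
# with a1 = -trace(phi) and a2 = det(phi).
a1 = -phi.trace()                     # -4, an element of I
a2 = phi.det()                        # 2*2 - 4*6 = -20, an element of I

# Cayley-Hamilton: phi^2 + a1*phi + a2*E = 0
print(phi**2 + a1*phi + a2*eye(2))    # the zero matrix
print(a1 % 2 == 0 and a2 % 2 == 0)    # both coefficients lie in I = (2)
```

Here both coefficients of the characteristic polynomial land in $I=(2)$ because every entry of $\phi$ does, which is exactly what the proposition predicts.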
The proposition is Proposition 2.4 in Atiyah-Macdonald, page 21.
Reading the proof, I understand it as follows.
Let $\omega_1,\ldots,\omega_n$ be generators of $M$. Since $\phi(\omega_i)\in IM$ for every $i$, there exist $a_{ij}\in I$ such that $\phi(\omega_i)=\sum_{j=1}^n a_{ij}\omega_j$. We can then write $$\sum_{j=1}^n(\delta_{ij}\phi-a_{ij})\omega_j=0,\ \ \ \ \textrm{with}\ 1\leq i\leq n,$$ where $\delta_{ij}$ is the Kronecker delta. In matrix form we get $Cx=0$, where $C$ is the $n\times n$ matrix with $C_{ij}=\delta_{ij}\phi-a_{ij}$ and $x=(\omega_1,\ldots,\omega_n)^T$.
Multiplying $Cx=0$ from the left by $\operatorname{Adj}(C)$ and substituting $\operatorname{Adj}(C)C=\det(C)E$, we get $\det(C)x=0$, i.e. $\det(C)\omega_i=0$ for every $i$, so $\det(C)=0$ as a map on $M$. Since $\det(C)$ is the characteristic polynomial evaluated at $\phi$, we get the equation.
Now my problem is that I don't see how or where we use that $\phi$ is a map from $M$ to $M$ and that it is $R$-linear. Why are these conditions necessary? Where in the proof are they used?
Writing $\delta_{ij}\phi-a_{ij}$ is meaningless as it stands: $\phi$ is a map and $a_{ij}$ is a ring element, so the difference needs an interpretation. Let's be more careful.
Identify the framework
The map $\phi$ is an endomorphism of $M$, so it belongs to the endomorphism ring $\operatorname{End}_R(M)$. The maps $\bar{a}\colon x\mapsto ax$, for $a\in R$, also belong to the endomorphism ring, and the subring $S$ of $\operatorname{End}_R(M)$ generated by these maps and by $\phi$ is commutative, precisely because $\phi$ is $R$-linear.
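The commutativity of $S$ is a one-line computation, and it is exactly where linearity of $\phi$ is used: for a ring element $a$ and any $x\in M$,
$$(\phi\circ\bar{a})(x)=\phi(ax)=a\,\phi(x)=(\bar{a}\circ\phi)(x),$$
so $\phi\circ\bar{a}=\bar{a}\circ\phi$; the middle equality is precisely linearity. Without it, the determinant trick below would not apply, since the adjugate identity requires the entries of the matrix to lie in a commutative ring.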
The main argument
For every $i$, we can write $$ \phi(\omega_i)=\sum_{j} a_{ij}\omega_j=\sum_{j} \overline{a_{ij}}(\omega_j) $$ with $a_{ij}\in I$. On the other hand, $$ \phi(\omega_i)=\sum_{j}\delta_{ij}\phi(\omega_j) $$ and therefore $$ \sum_j (\delta_{ij}\phi-\overline{a_{ij}})(\omega_j)=0 $$
The elements $\delta_{ij}\phi-\overline{a_{ij}}$ are then the entries of an $n\times n$ matrix $C$ over the ring $S$.
Since $S$ is commutative, such a matrix has an adjugate $C'$ satisfying $C'C=\det(C)E$, where $E$ is the identity matrix.
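The adjugate identity over a commutative ring can be checked symbolically; here is a sketch in Python with sympy, treating a generic $3\times 3$ matrix whose entries are commuting indeterminates (a stand-in for the entries in $S$):

```python
# Symbolic check of the adjugate identity adj(C)*C = det(C)*E
# for a generic 3x3 matrix over a commutative (polynomial) ring.
from sympy import Matrix, symbols, eye

entries = symbols('c0:9')            # nine commuting indeterminates
C = Matrix(3, 3, entries)

lhs = C.adjugate() * C               # adjugate = transpose of cofactor matrix
rhs = C.det() * eye(3)
print((lhs - rhs).expand())          # the zero matrix: the identity holds
```

The cancellation relies on the indeterminates commuting; with non-commuting entries (e.g. a general matrix of endomorphisms) the identity fails, which is why the commutativity of $S$ matters.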
In particular, $\det(C)\omega_i=0$ for every $i$, and so $\det(C)$ is an endomorphism of $M$ that vanishes on all generators of $M$: this means $\det(C)=0$.
The rest is clear, because the determinant is of the form $\phi^n+\overline{a_{1}}\phi^{n-1}+\dots+\overline{a_{n-1}}\phi+\overline{a_n}$ and, since all coefficients $a_{ij}$ belong to $I$, also $a_1,\dots,a_n\in I$ (standard application of Laplace expansion and the theory of characteristic polynomials).
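For instance, when $n=2$ the expansion reads
$$\det(C)=(\phi-\overline{a_{11}})(\phi-\overline{a_{22}})-\overline{a_{12}}\,\overline{a_{21}}=\phi^2-(\overline{a_{11}}+\overline{a_{22}})\,\phi+(\overline{a_{11}}\,\overline{a_{22}}-\overline{a_{12}}\,\overline{a_{21}}),$$
so $a_1=-(a_{11}+a_{22})$ and $a_2=a_{11}a_{22}-a_{12}a_{21}$, and both lie in $I$ because every $a_{ij}$ does (indeed $a_2\in I^2$).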
Final comment
What Atiyah and Macdonald denote by $\phi^n+a_1\phi^{n-1}+\dots+a_n$ is the same as what I denoted above more accurately by $\phi^n+\overline{a_{1}}\phi^{n-1}+\dots+\overline{a_n}$. It's quite common to abuse notation and “identify” $a\in R$ with the map $\bar{a}\colon x\mapsto ax$ (an endomorphism of $M$).