From "Algebra: Chapter 0" by Aluffi:
I think the first part of this exercise goes as follows:
Suppose $S$ is linearly independent in $M$ over $R$, and suppose $k_1s_1 +\cdots + k_ns_n=0$ where $k_i \in K(R)$ and $s_i \in S$. Since $k_i \in K(R)$, we can write $k_i=a_i/b_i$ with $a_i, b_i \in R$ and $b_i \ne 0$. Multiplying the relation $$0=k_1s_1 +\cdots +k_ns_n=(a_1/b_1)s_1+\cdots+(a_n/b_n)s_n$$ through by $b_1b_2\cdots b_n$ clears the denominators and yields $$(a_1b_2\cdots b_n)s_1+(a_2b_1b_3\cdots b_n) s_2+\cdots + (a_nb_1\cdots b_{n-1}) s_n=0,$$ a relation with coefficients in $R$. Since $S$ is linearly independent over $R$, this forces $a_1b_2\cdots b_n=a_2b_1b_3\cdots b_n=\cdots = a_nb_1\cdots b_{n-1}=0.$
Since $R$ is an integral domain and each $b_i \ne 0$, every product of the $b_i$'s is nonzero, so $a_1=a_2=\cdots=a_n=0$. Hence $k_i =0$ for all $i$, and $S$ is linearly independent in $V$ over $K(R)$.
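As a concrete sanity check (an example of my own, with $R=\mathbb{Z}$, so $K(R)=\mathbb{Q}$): if $s_1, s_2$ are linearly independent in $M=\mathbb{Z}^2$, then a purported $\mathbb{Q}$-relation clears to a $\mathbb{Z}$-relation, $$\tfrac{1}{2}\,s_1+\tfrac{2}{3}\,s_2=0 \quad\xrightarrow{\;\cdot\, b_1b_2\,=\,6\;}\quad 3s_1+4s_2=0,$$ and independence over $\mathbb{Z}$ would force $3=4=0$ in $\mathbb{Z}$, which is absurd; so no such nontrivial $\mathbb{Q}$-relation exists.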
The converse is trivial: an $R$-linear relation among elements of $S$ is in particular a $K(R)$-linear relation, since $R$ embeds in $K(R)$.
I am stuck on the second part of this exercise: the rank of $M$ as an $R$-module equals the dimension of $V$ as a $K(R)$-vector space.
Since $M= R^{\oplus A}$ is free, the rank of $M$ is $|A|$. However, how do we know that if $A$ generates $M$ over $R$, then $A$ generates $V$ over $K(R)$?
By the exercise I proved above, all we know is that $A$ is linearly independent in $V$ over $K(R)$.
Why isn't it possible to add a vector of $V\setminus M$ to $A$ and have the enlarged set still be linearly independent in $V$ over $K(R)$?


You've proved that $A$ is linearly independent in $V$ as a vector space over $K = K(R)$, so $\operatorname{rank} M = |A| \leq \dim_K V$.
Now fix a basis of $V$. Crucially, we can multiply each of its elements by a common denominator of its coordinates to get a basis with all of its elements lying in $M$: scaling by a nonzero element of $K$ does not affect being a basis. By the statement you proved (its trivial direction), this scaled basis is still linearly independent over $R$; and an $R$-linearly independent subset of $M=R^{\oplus A}$ has at most $|A|$ elements, since by the first part it stays $K$-linearly independent inside $K^{\oplus A}$, whose dimension is $|A|$. So $\dim_K V \leq |A| = \operatorname{rank} M$, as required.
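To spell out the scaling step (notation mine: $e_i$ denotes the standard basis of $R^{\oplus A}$, and each sum below has only finitely many nonzero terms): if $v_j=\sum_i \frac{a_{ij}}{b_{ij}}\,e_i$ is an element of the chosen basis of $V$, set $d_j:=\prod_i b_{ij}\ne 0$; then $$d_j v_j=\sum_i \Bigl(a_{ij}\prod_{i'\ne i} b_{i'j}\Bigr)e_i\in M,$$ and since each $d_j$ is invertible in $K(R)$, the set $\{d_j v_j\}_j$ is again a basis of $V$.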