I am reading Soare's book and I am trying to understand the following statement: a function $f$ has r.e. degree iff it is the limit of a recursive sequence $\{f_s\}_{s \in \mathbb{N}}$ with a modulus of convergence $m \leq_T f$.
However, I am having trouble finding an example to illustrate this theorem. Would it help to consider the difference between recursive sets and r.e. sets?
Sure! The classic example of this is the characteristic function of the Halting Problem, which is recursively enumerable but not recursive.
This function, $f$, is defined as: $f(x)=1$ if $\varphi_x(x)$ halts, and $f(x)=0$ otherwise (where $\{\varphi_e: e\in\mathbb{N}\}$ is some standard enumeration of the partial computable functions).
Now $f$ is a limit of recursive functions in a natural way. Let $f_s(x)=1$ if $\varphi_x(x)$ halts in at most $s$ stages, and $f_s(x)=0$ otherwise. Then each $f_s$ is recursive, and their limit is $f$: $f(x)=1$ iff for all sufficiently large $s$, we have $f_s(x)=1$. So the quoted result says that $f$ has r.e. degree (and indeed, $f$ itself is r.e.).
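This stagewise approximation can be sketched in code. The following is my own toy model, not anything from Soare: the list `programs` is a made-up, finite stand-in for an enumeration of the partial computable functions, where each entry reports how many steps $\varphi_x(x)$ takes to halt, or `None` if it diverges.

```python
# Toy model: programs[x](x) gives the halting time of phi_x(x), or None
# if phi_x(x) diverges. (A hypothetical stand-in for a real enumeration.)
programs = [
    lambda x: x,          # "phi_0(x)" halts after x steps
    lambda x: None,       # "phi_1" never halts
    lambda x: 2 * x + 1,  # "phi_2(x)" halts after 2x+1 steps
]

def f_s(s, x):
    """Stage-s approximation: 1 if phi_x(x) halts within s steps, else 0."""
    steps = programs[x](x)
    return 1 if steps is not None and steps <= s else 0

def f(x):
    """The 'true' halting function for this toy enumeration. (Computable
    here only because the model is finite and we can peek at `programs`.)"""
    return 1 if programs[x](x) is not None else 0

# Watch each column stabilize to f(x) as s grows:
for x in range(3):
    print(x, [f_s(s, x) for s in range(6)], "->", f(x))
```

Note that the approximation only ever changes from $0$ to $1$, never back: once $\varphi_x(x)$ is seen to halt, it stays halted at every later stage.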
So this is an example of a function of r.e. degree arising as the limit of a sequence of recursive functions. In general, though, the limit of a recursive sequence need only have $\Delta^0_2$ degree! (This is Shoenfield's Limit Lemma.) And there are lots of $\Delta^0_2$ degrees which are not r.e. degrees. To ensure that the limit is of r.e. degree, we need an additional technical condition: the modulus of convergence (that is, the point $t=m(n)$ at which we see $f(n)$ "stabilize" - $f_s(n)=f(n)$ for all $s>t$) must be computable from $f$. And indeed, in the example above it is easy to see that $m\le_T f$: given $f$, if $f(n)=0$ then $f_s(n)=0$ for every $s$, so $m(n)=0$; and if $f(n)=1$, we simply run $\varphi_n(n)$ until it halts and output that stage.
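To make the role of the oracle concrete, here is a small sketch of my own (not from Soare) of computing the modulus with a single query to $f$. The list `programs` is a made-up, finite stand-in for an enumeration of the partial computable functions, so `f_oracle` can "cheat" by peeking at it; I use the convention that $m(x)$ is the least $t$ with $f_s(x)=f(x)$ for all $s\ge t$.

```python
# Toy model: programs[x](x) gives the halting time of phi_x(x), or None
# if phi_x(x) diverges. (A hypothetical stand-in for a real enumeration.)
programs = [
    lambda x: x,          # "phi_0(x)" halts after x steps
    lambda x: None,       # "phi_1" never halts
    lambda x: 2 * x + 1,  # "phi_2(x)" halts after 2x+1 steps
]

def f_s(s, x):
    """Stage-s approximation: 1 if phi_x(x) halts within s steps, else 0."""
    steps = programs[x](x)
    return 1 if steps is not None and steps <= s else 0

def f_oracle(x):
    """Oracle for f; in the toy model we can cheat and peek at `programs`."""
    return 1 if programs[x](x) is not None else 0

def modulus(x):
    """m(x): least t with f_s(x) = f(x) for all s >= t, computed with one
    oracle query. If f(x) = 0 the approximation was never wrong, so
    m(x) = 0; if f(x) = 1 the unbounded search below is safe - it is
    guaranteed to terminate because phi_x(x) really does halt."""
    if f_oracle(x) == 0:
        return 0
    t = 0
    while f_s(t, x) == 0:
        t += 1
    return t
```

The key point is that without the oracle answer, the `while` loop might run forever; the single query to $f$ is exactly what licenses the unbounded search.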