In Soare's Computability Theory and Applications, he gives a very quick proof that the following set is $\Sigma_3^0$-complete:
$$\text{Rec} := \{e \mid W_e \text{ is recursive}\}$$
It's fairly straightforward to show that Rec is $\Sigma_3^0$. To show that it's $\Sigma_3^0$-hard, Soare claims that it follows from four facts. Maybe I'm missing something obvious, but even granting these four facts, I still don't see why the claim follows. These facts are:
- The set $\text{Cpl} := \{e \mid W_e \equiv_T K\}$ is $\Pi_3^0$-hard.
- The set $\text{Cof} := \{e \mid W_e \text{ is cofinite}\}$ is $\Sigma_3^0$-complete.
- $\text{Cof} \subseteq \text{Rec}$.
- $\text{Rec} \cap \text{Cpl} = \emptyset$.
Facts 3 and 4 are easy to verify (every cofinite set is recursive, since its complement is finite, and no recursive set can be Turing-equivalent to $K$), and facts 1 and 2 can be proven via a movable marker construction. Any help elucidating this proof would be appreciated.
Aside: In Theory of Recursive Functions and Effective Computability, Rogers gives a separate proof of Rec's $\Sigma_3^0$-completeness via a priority argument. I think I understand that proof (from which he goes on to infer Cof's $\Sigma_3^0$-completeness), so I'm convinced Rec is $\Sigma_3^0$-complete. I'm just not sure why it follows in the way Soare suggests.
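For reference, here is one way to see the upper bound $\text{Rec} \in \Sigma_3^0$ mentioned above (my own rendering of the standard computation, not a quote from either book):
$$e \in \text{Rec} \iff (\exists i)(\forall x)\,\big[\,x \in W_i \leftrightarrow x \notin W_e\,\big].$$
Since $x \in W_i$ is $\Sigma_1^0$ and $x \notin W_e$ is $\Pi_1^0$, the matrix is a Boolean combination of $\Sigma_1^0$ predicates; the universal quantifier over $x$ then gives a $\Pi_2^0$ predicate, and the outer existential quantifier over $i$ gives $\Sigma_3^0$.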
Recall that in Soare's notation, $(A,B) \leq_1 (C,D)$ if and only if there is an injective computable function $f : \omega \rightarrow \omega$ such that $f(A) \subseteq C$, $f(B) \subseteq D$, and $f(\overline{A \cup B}) \subseteq \overline{C \cup D}$.
The notation $(\Sigma_n, \Pi_n) \leq_1 (C,D)$ means that for every $A \in \Sigma_n$, $(A, \overline{A}) \leq_1 (C,D)$.
At least in my edition of Soare's book, the $\Sigma_3^0$-completeness of $\text{Rec}$ is a corollary of the theorem stated one line above, which reads:
$$(\Sigma_3, \Pi_3) \leq_1 (\text{Cof}, \text{Cpl})$$
So given any $\Sigma_3^0$ set $A$, there is an injective computable function $f$ such that if $x \in A$, then $f(x) \in \text{Cof} \subseteq \text{Rec}$, and if $x \notin A$, then $f(x) \in \text{Cpl} \subseteq \omega \setminus \text{Rec}$ (the third condition in the definition of $\leq_1$ is vacuous here, since $A \cup \overline{A} = \omega$). Hence this $f$ witnesses $A \leq_1 \text{Rec}$, and since $A$ was an arbitrary $\Sigma_3^0$ set, $\text{Rec}$ is $\Sigma_3^0$-complete.
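Spelled out as one chain of containments, using facts 3 and 4 from the question:
$$x \in A \implies f(x) \in \text{Cof} \subseteq \text{Rec}, \qquad x \in \overline{A} \implies f(x) \in \text{Cpl} \subseteq \overline{\text{Rec}},$$
so $x \in A \iff f(x) \in \text{Rec}$, which is exactly the statement that $f$ 1-reduces $A$ to $\text{Rec}$.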