Constructing a model of set theory via forcing requires dealing with some complications: one needs "names" for elements of the model, or Boolean-valued models, or sheaves, or something along those lines.
I am looking for an application of forcing where the concept appears in close to a "pure" form; this is for pedagogical reasons. The audience will be familiar with the basic concepts of first-order logic.
I was wondering if there are examples in recursion theory. It's easy to define a subset of $\mathbb{N}$ that is generic with respect to the countable family of r.e. sets of conditions (or, alternatively, the arithmetic sets of conditions). This would require minimal background from the audience. Are there any interesting results based on this approach? It wouldn't matter if there are "non-forcing" proofs of the same results, so long as the "forcing" proofs are short and sweet.
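To illustrate how little machinery is needed, here is a toy sketch (in Python; the representation and all names are my own invention) of the finite-extension construction: each dense set is represented by a witness function mapping any condition (a finite binary string) to an extension lying in that set, and we simply walk through the witnesses in order.

```python
# Toy "finite extension" construction: given a list of dense sets of
# conditions (finite binary strings), build one string meeting all of them.
# Each dense set is given by a witness function that returns, for any
# condition, an extension inside the set.

def meet_all(dense_witnesses, start=""):
    """Extend `start` so as to land in each dense set in turn."""
    sigma = start
    for extend in dense_witnesses:
        tau = extend(sigma)
        assert tau.startswith(sigma)  # a witness must return an extension
        sigma = tau
    return sigma

# Two sample dense families of conditions:
#   D_n = { s : len(s) > n }                  (pad with zeros if needed)
#   E_n = { s : s has a 1 at some position >= n }
def D(n):
    return lambda s: s + "0" * max(0, n + 1 - len(s))

def E(n):
    return lambda s: (s + "0" * max(0, n - len(s))) + "1"

witnesses = [w for n in range(5) for w in (D(n), E(n))]
g = meet_all(witnesses)
print(g)  # a single condition meeting all ten dense sets
```

With a countable family one would interleave all the witnesses into a single infinite construction; the union of the resulting chain of conditions is then the desired generic set.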
There are many examples. Very quickly they get quite complicated - e.g. I think the Low Basis Theorem might not be pedagogically appropriate - but there are indeed examples.
In particular, the easiest (in my opinion) proof that there are incomparable Turing degrees is a forcing argument. Specifically, one shows that for any pair of finite binary strings $\sigma,\tau$ and any $e\in\mathbb{N}$, there are extensions $\sigma'\supseteq\sigma$ and $\tau'\supseteq\tau$ such that there are no infinite binary sequences $f\supseteq\sigma', g\supseteq\tau'$ with $\Phi_e^f=g$. This lets us inductively build a pair of infinite binary sequences which are Turing incomparable, by diagonalizing against all possible reductions (the phrase "finite extensions" is often used here). The Friedberg-Muchnik theorem improves on this by making the degrees c.e., at the cost of moving from "pure" forcing to the more complicated setting of priority arguments; see this old question.
Now, you brought up the issue of "generic" versus "generic for some specific collection of dense sets" (e.g. the r.e. ones). There is a rich hierarchy of "levels of genericity" in computability theory, and this is also the case for non-Cohen forcing notions (unfortunately the word "generic" in computability theory usually means Cohen-generic). In particular, examining the argument above we see the following:
The dense sets we want to meet are those of the form "No infinite extension of the left string (resp. right) computes any infinite extension of the right string (resp. left) via $\Phi_e$."
On the face of it, each of these is quite complicated - namely, $\Pi^1_1$. There are still only countably many such dense sets, so we can trivially meet them all.
... But we might also want to see how simple we can go! There are two ways that a computation can "fail:" it can make an error, or it can fail to give any answer at all. This lets us redefine the dense sets we care about: "EITHER there is already some $n$ such that $\Phi_e^{left}(n)$ halts, $right(n)$ is defined, and $\Phi_e^{left}(n)\not=right(n)$, OR there is some $n$ such that no extension of the left string makes $\Phi_e^?(n)$ halt." (And similarly, swapping "left" and "right.")
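Spelled out for a pair of conditions $(\sigma,\tau)$ (writing $\downarrow$ for "halts" and $\uparrow$ for "diverges"), the "left" version of this condition is:

$$\exists n\,\Big[\Phi_e^{\sigma}(n)\!\downarrow\ \wedge\ n<|\tau|\ \wedge\ \Phi_e^{\sigma}(n)\neq\tau(n)\Big]\ \vee\ \exists n\,\forall\sigma'\supseteq\sigma\,\Big[\Phi_e^{\sigma'}(n)\!\uparrow\Big].$$

The first disjunct is $\Sigma^0_1$; in the second, "$\Phi_e^{\sigma'}(n)\uparrow$" is $\Pi^0_1$ and $\sigma'$ ranges over finite strings only, so that disjunct is $\Sigma^0_2$, as is the whole condition.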
This definition is easily checked to be $\Sigma^0_2$. So we have: any pair of sequences built by meeting all of these $\Sigma^0_2$ dense sets - in particular, any pair of mutually arithmetically generic reals - is Turing incomparable.
Here's another example. I'm not sure this fits the bill - the proof is quite complicated if one hasn't gotten comfortable with forcing in computability theory already - but the forcing itself is simple, the question is easy to state (although the answer is a bit complicated), and the whole situation is quite interesting.
Let's say we want to build a fast-growing function $\omega\rightarrow\omega$. There is a natural forcing which leaps to mind (there are many variations on this idea):
Conditions are pairs $(p, f)$ with $p$ a map from some finite subset of $\omega$ to $\omega$, and $f:\omega\rightarrow\omega$ is total with $p\subseteq f$.
The ordering is given by $(p, f)\le (q, g)$ (remember, "$\le$" means "is stronger than") if:
- $p\supseteq q$,
- $f(n)\ge g(n)$ for all $n$, and
- for each $m$ in $dom(p)\setminus dom(q)$, we have $p(m)\ge g(m)$.
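As a sanity check, here is a small sketch (in Python; the representation and names are mine) of this extension relation on concrete conditions, with $p,q$ as dicts and $f,g$ as total functions. Note that pointwise domination $f\ge g$ cannot be verified algorithmically for black-box functions, so it is only spot-checked on an initial segment; for the specific $f,g$ below it clearly holds everywhere.

```python
# Conditions are pairs (p, f): p a dict (finite partial function on omega)
# and f a total function with p contained in f.

def extends(p, f, q, g, spot_check=1000):
    """Is (p, f) <= (q, g), i.e. is (p, f) the stronger condition?"""
    return (
        all(n in p and p[n] == q[n] for n in q)           # p extends q
        and all(f(n) >= g(n) for n in range(spot_check))  # f >= g pointwise (spot check only)
        and all(p[m] >= g(m) for m in p if m not in q)    # new values respect the old promise
    )

# A weak condition: value 5 committed at 0, promised growth at least max(5, n).
q = {0: 5}
g = lambda n: q[n] if n in q else max(5, n)

# A stronger condition extending it: more committed values, a faster promise.
p = {0: 5, 1: 7, 2: 100}
f = lambda n: p[n] if n in p else max(100, 2 * n)

print(extends(p, f, q, g))   # True
print(extends(q, g, p, f))   # False: q doesn't even extend p
```

The third clause is the heart of the notion: the values we commit to while extending a condition must honor the growth promise $g$ that the weaker condition already made.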
Basically, what's going on here is the following. We're trying to build a function $\omega\rightarrow\omega$. In a condition $(p, f)$, the $p$-part is the amount of the function we've built so far, and the $f$ part is a "promise" that from now on the function we're building will grow at least as fast as $f$.
This is called Hechler forcing. It's not the only forcing that adds a fast-growing function, of course, but it is in my opinion the most natural (and it has some nice set-theoretic universality properties - e.g. Truss showed that if $V[d]$ is a forcing extension of the ground model $V$ where $d:\omega\rightarrow\omega$ dominates every function in $V$, and $c\in\omega^\omega$ is Cohen generic over $V[d]$, then $d+c$ is Hechler generic over $V$).
Now, an important class of questions in computability theory is of the form: for a given forcing notion, what can a sufficiently generic object compute?
For example, it's not hard (thinking about the argument above) to show that two sufficiently mutually Cohen generic reals don't compute anything in common besides the computable sets (they form a "minimal pair"), so there is no noncomputable real which "being sufficiently Cohen generic" lets me compute.
One particularly natural question of this form, phrased in the contrapositive, is: which reals are computable from every sufficiently fast-growing (equivalently, every sufficiently Hechler-generic) function?
It turns out that the set of such reals has a very snappy characterization: they are exactly the hyperarithmetic reals.
The hyperarithmetic sets are essentially those which can be computed by taking repeated Turing jumps - possibly infinitely many jumps, but only "computably many." (There are many other characterizations as well.) Precisely defining this is tricky, but it's not too hard to $(i)$ show that "the $\omega$th jump of $0$" has an obvious definition and $(ii)$ argue that stuff gets really weird if we try to take the $\alpha$th jump of $0$ for a really really complicated countable ordinal $\alpha$ (namely, any which doesn't have a computable copy).
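Glossing over the bookkeeping with ordinal notations, the hierarchy in question looks like:

$$0^{(0)}=\emptyset,\qquad 0^{(\alpha+1)}=\big(0^{(\alpha)}\big)',\qquad 0^{(\lambda)}=\bigoplus_{\alpha<\lambda}0^{(\alpha)}\ \text{ for limit }\lambda,$$

where the join at a limit stage is taken along a fixed computable copy of $\lambda$. The hyperarithmetic sets are then exactly those computable from some $0^{(\alpha)}$ with $\alpha<\omega_1^{CK}$, the least ordinal without a computable copy.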
The proof of the theorem above - due to Solovay, if I recall correctly - is quite nice. Showing that every hyperarithmetic real is computable from every sufficiently fast-growing function is a proof by transfinite induction, using a generalized notion of the Busy Beaver function. The converse is the interesting direction, and involves a clever analysis of the forcing relation $\Vdash_{Hechler}$. If you're interested, Peter Gerdes' thesis is a good place to start reading.