In Dummit & Foote, problem 14.2.17(c), the authors hand us a quadratic extension of the form $F(\sqrt D)$. While I am pretty sure you need $D \in F$ to do this particular problem, I can't help but wonder whether that assumption is even necessary. In other words, if the degree of the extension $F(\sqrt D)/F$ is $2$, must $D$ belong to $F$?
My thoughts so far are as follows: if $D \not\in F$, then $[F(D) : F] > 1$. Since $D = (\sqrt D)^2 \in F(\sqrt D)$, we have $F(D) \subseteq F(\sqrt D)$, and the tower law forces $[F(D):F] = 2$, for otherwise $$ 4 \le [F(\sqrt D) : F(D)][F(D):F] = [F(\sqrt D): F] = 2. $$ This tells us that $F(D) = F(\sqrt D)$. My next observation was that if $\sqrt D$ has minimal polynomial $x^2 + ax + b$ over $F$, then $D + a\sqrt D + b = 0$, so we can write $\sqrt D$ in terms of $D$ and elements of $F$ and use this to find the minimal polynomial of $D$ over $F$: $$ \sqrt D = -(D + b)/a, \qquad m_{D, F}(x) = x^2 + (2b - a^2)x + b^2. $$ Notice here that $a \neq 0$, or else $D = -b \in F$!
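As a sanity check on that last computation, here is a quick symbolic verification (a sketch assuming sympy is available): writing $s = \sqrt D$ with $s^2 + as + b = 0$, the claimed minimal polynomial of $D = s^2$ should vanish modulo $s^2 + as + b$.

```python
from sympy import symbols, expand, rem

s, a, b = symbols('s a b')
# s = sqrt(D) satisfies s**2 + a*s + b = 0, and D = s**2.
# Claimed minimal polynomial of D over F: x**2 + (2b - a**2)x + b**2.
candidate = (s**2)**2 + (2*b - a**2)*(s**2) + b**2
# Reduce modulo the minimal polynomial of s (as a polynomial in s):
remainder = rem(candidate, s**2 + a*s + b, s)
print(expand(remainder))  # 0, so D is a root of the claimed quadratic
```

Indeed, $s^4 + (2b - a^2)s^2 + b^2 = (s^2 + as + b)(s^2 - as + b)$, which is why the remainder is zero.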
This is false. Consider the case of $F=\Bbb{Q}$, $D=(1+\sqrt2)^2=3+2\sqrt2.$
We have $\sqrt{D}=\pm(1+\sqrt2)$, so $\Bbb{Q}(\sqrt D)=\Bbb{Q}(D)=\Bbb{Q}(\sqrt2)$. Thus $[\Bbb{Q}(\sqrt D):\Bbb{Q}]=2$ even though $D \notin \Bbb{Q}$.
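For what it's worth, the counterexample can be checked mechanically with sympy (a sketch, assuming sympy is installed):

```python
from sympy import sqrt, sqrtdenest, minimal_polynomial, Symbol, expand

x = Symbol('x')
D = 3 + 2*sqrt(2)

# D is a root of a quadratic over Q, so [Q(D):Q] = 2 and D is not in Q:
print(minimal_polynomial(D, x))            # x**2 - 6*x + 1

# sqrt(D) denests to 1 + sqrt(2), so Q(sqrt(D)) = Q(sqrt(2)) = Q(D):
print(sqrtdenest(sqrt(D)))                 # 1 + sqrt(2)

# sqrt(D) also has degree 2 over Q:
print(minimal_polynomial(sqrt(D), x))      # x**2 - 2*x - 1
```

Note that $x^2 - 6x + 1 = x^2 + (2b - a^2)x + b^2$ with $a = -2$, $b = -1$, matching the formula in the question.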