If $K$ is a finite field, then one can easily show that there is no proper subring $R \subsetneq K$ with $Q(R) = K$, where $Q(R)$ denotes the field of fractions of $R$. Consequently, an algebraic extension $K$ of a finite field has no proper subring $R$ with $Q(R) = K$: any subring $R \subseteq K$ contains $\mathbb{F}_p$, and for $x \in R \setminus \{0\}$ the finite integral domain $\mathbb{F}_p[x]$ is a field, so $x^{-1} \in R$; hence $R$ is itself a field and $Q(R) = R$.
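As a small sanity check (illustrative only), one can enumerate the unital subrings of $\mathbb{F}_4$ by brute force, encoding $\mathbb{F}_4 = \{0, 1, a, a+1\}$ with $a^2 = a + 1$; the encoding as bit patterns is my own choice, not from the text. The only proper subring turns out to be $\mathbb{F}_2$, which is already a field, so its fraction field is $\mathbb{F}_2 \ne \mathbb{F}_4$:

```python
from itertools import combinations

# GF(4) = {0, 1, a, a+1}, encoded as ints 0..3 (bit 1 = coefficient of a),
# with arithmetic mod x^2 + x + 1 over GF(2), i.e. a^2 = a + 1.
def add(u, v):
    return u ^ v  # addition in characteristic 2 is XOR

def mul(u, v):
    b1, b0 = u >> 1, u & 1
    c1, c0 = v >> 1, v & 1
    hi = (b1 & c0) ^ (b0 & c1) ^ (b1 & c1)  # coefficient of a (using a^2 = a+1)
    lo = (b0 & c0) ^ (b1 & c1)              # constant coefficient
    return (hi << 1) | lo

def is_subring(s):
    # unital subring: contains 0 and 1, closed under + and *
    return {0, 1} <= s and all(add(u, v) in s and mul(u, v) in s
                               for u in s for v in s)

candidates = [set(c) | {0, 1}
              for k in range(0, 3)
              for c in combinations({2, 3}, k)]
subrings = [s for s in candidates if is_subring(s)]
print(subrings)  # only F_2 = {0, 1} and GF(4) itself survive
```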
If $K / \mathbb{Q}$ is an algebraic extension, we have $Q(R) = K$ for the ring $$ R := \{ x \in K : \exists f \in \mathbb{Z}[X] \text{ monic}, f(x) = 0\}.$$ More generally, if $L / K$ is an algebraic extension and $K = Q(S)$, then the analogously defined ring $$R := S^L := \{ x \in L : \exists f \in S[X] \text{ monic}, f(x) = 0\}$$ satisfies $Q(R) = L$. Moreover, $R$ is integrally closed in $L$ (every root $x \in L$ of a monic polynomial $f \in R[X]$ already lies in $R$, by transitivity of integral dependence), and $R$ is a field if and only if $S$ is a field.
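The reason $Q(R) = K$ in the first case is denominator clearing: if $x \in K$ is a root of $a_nX^n + \dots + a_0 \in \mathbb{Z}[X]$, then $a_n x$ is a root of a monic integer polynomial, so $x = (a_n x)/a_n$ is a quotient of elements of $R$. A quick illustration with sympy (the choice of example $\alpha = \sqrt{2}/3$ is mine):

```python
from sympy import sqrt, symbols, minimal_polynomial, LC

x = symbols('x')

# An algebraic number that is NOT an algebraic integer:
alpha = sqrt(2) / 3
p = minimal_polynomial(alpha, x)   # 9*x**2 - 2, not monic over Z
a_n = LC(p, x)                     # leading coefficient 9

# Scaling by the leading coefficient yields an algebraic integer:
beta = a_n * alpha                 # 3*sqrt(2)
q = minimal_polynomial(beta, x)    # x**2 - 18, monic over Z
print(p, q)
# alpha = beta / a_n, so alpha lies in Q(R) for R = {algebraic integers in K}
```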
Since purely transcendental field extensions are, by definition, quotient fields of proper subrings (namely polynomial rings), we see that every field except the algebraic extensions of finite fields can be written as $K = Q(R)$ for some proper subring $R \subsetneq K$.
I now wonder whether there is some kind of inverse of this procedure.
Let $L/K$ be an algebraic field extension and let $R \subseteq L$ satisfy $Q(R) = L$; we assume that $R$ is integrally closed in $L$. Define $R_K := R \cap K$. Then $R_K$ is again an integral domain, being an intersection of integral domains. But what do we know about $Q(R_K)$? Is it equal to $K$, or is the extension $K / Q(R_K)$ at least algebraic? Do we have $(R_K)^L = R$?
If $K / Q(R_K)$ is algebraic, we have the inclusion $(R_K)^L \subseteq R$ (which is more or less obvious), and again $R$ is a field (so $R = L$) if and only if $R_K$ is a field. Perhaps closely related is the question whether there is a smallest integral domain $R \subseteq L$ with $Q(R) = L$.
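For a concrete instance of these questions (an example, not a proof), take $L = \mathbb{Q}(\sqrt{2})$, $K = \mathbb{Q}$, and $R = \mathbb{Z}[\sqrt{2}]$, which is the ring of integers of $L$ and hence integrally closed. Then $R_K = R \cap \mathbb{Q} = \mathbb{Z}$, so $Q(R_K) = \mathbb{Q} = K$ and $(R_K)^L = R$; all three questions have a positive answer here. A sympy check that $\sqrt{2} \in R$ while $(1+\sqrt{2})/2 \notin R$:

```python
from sympy import sqrt, symbols, minimal_polynomial, LC

x = symbols('x')

# L = Q(sqrt(2)), K = Q, R = Z[sqrt(2)] = ring of integers of L.
# sqrt(2) is integral over Z:
p = minimal_polynomial(sqrt(2), x)            # x**2 - 2, monic over Z
# (1 + sqrt(2))/2 is NOT integral over Z, so it lies outside R:
q = minimal_polynomial((1 + sqrt(2)) / 2, x)  # 4*x**2 - 4*x - 1, not monic
print(p, LC(p, x))  # leading coefficient 1 -> an algebraic integer
print(q, LC(q, x))  # leading coefficient 4 -> not an algebraic integer
```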
Let $f,g,h\in k[X]$ be such that $$\dfrac{f(X+\dfrac{1}{X})}{g(X+\dfrac{1}{X})}=h(X).$$ Set $m=\deg f$ and $n=\deg g$. Multiplying numerator and denominator by $X^m$ and $X^n$ respectively, we obtain $$\dfrac{X^mf(X+\dfrac{1}{X})}{X^ng(X+\dfrac{1}{X})}=X^{m-n}h(X).$$ Set $f_1(X)=X^mf(X+\dfrac{1}{X})$ and $g_1(X)=X^ng(X+\dfrac{1}{X})$. Then $f_1,g_1\in k[X]$ with $\deg f_1=2m$, $\deg g_1=2n$, and moreover $\gcd(X,f_1(X))=\gcd(X,g_1(X))=1$, since the constant term of $f_1$ is the leading coefficient of $f$, and similarly for $g_1$. From $$\dfrac{f_1(X)}{g_1(X)}=X^{m-n}h(X)$$ we argue by cases. If $m\ge n$, then $f_1(X)=X^{m-n}h(X)g_1(X)$; since $X\nmid f_1(X)$, we must have $m=n$, and comparing degrees in $f_1=hg_1$ gives $\deg h=0$. If $n>m$, then $X^{n-m}f_1(X)=g_1(X)h(X)$; since $X\nmid g_1(X)$, we get $h(X)=X^{n-m}h_1(X)$ for some $h_1\in k[X]$, so $f_1(X)=g_1(X)h_1(X)$ and hence $2m=2n+\deg h_1\ge 2n>2m$, a contradiction. In every case $h$ is constant.
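As a quick sanity check of this conclusion (illustrative, not part of the argument; the sample polynomials are my own), one can substitute $X + 1/X$ with sympy and inspect the reduced fraction: a non-constant quotient never clears its denominator, while a constant one does.

```python
from sympy import symbols, cancel, fraction, S

X = symbols('X')

def ratio(f, g):
    """f(X + 1/X) / g(X + 1/X) in lowest terms, as (numerator, denominator)."""
    expr = cancel(f.subs(X, X + 1/X) / g.subs(X, X + 1/X))
    return fraction(expr)

# Sample with a would-be non-constant h: f = X**2 - 2, g = 1 gives
# X**2 + 1/X**2, whose reduced denominator is X**2 -- not a polynomial.
num1, den1 = ratio(X**2 - 2, S(1))
print(num1, den1)   # X**4 + 1, X**2

# Constant case: f = 2*X + 6, g = X + 3 gives h = 2, a polynomial (deg 0).
num2, den2 = ratio(2*X + 6, X + 3)
print(num2, den2)   # 2, 1
```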