The following question arose from some research problems I have been thinking about. Suppose $0\leq s\leq 1$ is a real number and $f\in H^s(\mathbb{R})$ (the usual Bessel potential space) with $\|f\|_{L^2}=1$. Now set $F_d:= f^{\otimes d}$ and consider $$\frac{\|(-\Delta)^{\frac{s}{2}}F_d\|_{L^2}^2}{\|(-\Delta)^{\frac{s}{2}}f\|_{L^2}^2} = \frac{(4\pi^2)^{s}}{\|(-\Delta)^{\frac{s}{2}}f\|_{L^2}^2}\int_{\mathbb{R}^d}\Big(|\xi_1|^2+\cdots+|\xi_d|^2\Big)^{s} |\hat{f}(\xi_1)|^2 \cdots |\hat{f}(\xi_d)|^2\,d\xi_1\cdots d\xi_d. \tag{1}$$
Question. What is the optimal asymptotic of the ratio in (1) as $d\rightarrow\infty$?
When $s=0$, the expression is just $1$; similarly, when $s=1$, it is $d$. I conjecture that the answer is $d^{s}$, but it is not clear to me how to prove this. Note that since $0\leq s\leq 1$, the map $t\mapsto t^s$ is subadditive on $[0,\infty)$ (equivalently, $\ell^{2s} \hookrightarrow \ell^2$), so $$(|2\pi\xi_1|^2+\cdots+|2\pi\xi_d|^2)^s \leq |2\pi\xi_1|^{2s}+\cdots+|2\pi\xi_d|^{2s}.$$ Integrating against $|\hat{f}(\xi_1)|^2\cdots|\hat{f}(\xi_d)|^2$ and using $\|f\|_{L^2}=1$, we get $$\frac{\|(-\Delta)^{\frac{s}{2}}F_d\|_{L^2}^2}{\|(-\Delta)^{\frac{s}{2}}f\|_{L^2}^2} \leq d.$$ On the other hand, since $z\mapsto |z|^s$ is concave, Jensen's inequality gives $$d^s\Big(\frac{|2\pi\xi_1|^2}{d}+\cdots+\frac{|2\pi\xi_d|^2}{d}\Big)^s \geq d^s\Big(\frac{|2\pi\xi_1|^{2s}}{d} +\cdots+\frac{|2\pi\xi_d|^{2s}}{d}\Big),$$ and therefore $$\frac{\|(-\Delta)^{\frac{s}{2}}F_d\|_{L^2}^2}{\|(-\Delta)^{\frac{s}{2}}f\|_{L^2}^2} \geq d^s.$$
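These two bounds can be sanity-checked numerically for a concrete choice of $f$. A minimal sketch, assuming a Gaussian $f$: then $|\hat f|^2$ is (up to a rescaling that cancels in the ratio) a standard Gaussian density, so the integral in (1) reduces to fractional moments of the chi-squared distribution via the standard identity $\mathbb{E}[(\chi^2_d)^s]=2^s\,\Gamma(d/2+s)/\Gamma(d/2)$:

```python
import math

# For a Gaussian f, |f-hat|^2 is (up to rescaling) a standard Gaussian density,
# so |2*pi*xi_1|^2 + ... + |2*pi*xi_d|^2 is, up to a constant that cancels in
# the ratio, a chi-squared variable with d degrees of freedom.

def log_chi2_moment(df, s):
    """log E[(chi^2_df)^s] = s*log(2) + lgamma(df/2 + s) - lgamma(df/2).

    Computed with lgamma to avoid overflow for large df."""
    return s * math.log(2.0) + math.lgamma(df / 2 + s) - math.lgamma(df / 2)

def ratio(d, s):
    """The ratio in (1) for Gaussian f: E[(Y_1 + ... + Y_d)^s] / E[Y_1^s]."""
    return math.exp(log_chi2_moment(d, s) - log_chi2_moment(1, s))

s = 0.5
rescaled = {}
for d in [10, 100, 1000, 10000]:
    r = ratio(d, s)
    assert d**s <= r <= d  # the two bounds derived above
    rescaled[d] = r / d**s
# rescaled[d] approaches a constant (here sqrt(pi/2) ~ 1.2533) as d grows,
# consistent with growth of exact order d^s.
```

Note that the limiting constant is not $1$ in this example, which suggests that $d^s$ is the right order of growth but that the ratio need not be asymptotic to $d^s$ itself.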
Fun problem! And actually, even though I usually like analysis a lot more than probability, I think probability is the right framework to answer this question!
Indeed, your assumptions imply that $|\hat f|^2$ is a probability density with a finite moment of order $2s$. Let $(X_i)_{i=1,\dots,d}$ be $d$ independent random variables with density $|\hat f|^2$ (so that their joint density is $|\hat F_d|^2$), and define $Y_i := |2\pi X_i|^2$. Then your question can be restated as: what is the asymptotic behavior of $$ \frac{\Bbb E((Y_1+\dots+Y_d)^s)}{\Bbb E(Y_1^s)}\,? $$ Now $f\in H^s$ means exactly that $\Bbb E(Y_i^s) = \Bbb E(|2\pi X_i|^{2s}) < \infty$. If moreover $\Bbb E(Y_1) < \infty$ (i.e. $f\in H^1$), the law of large numbers tells you that $$ \bar{Y}_d := \frac{Y_1+\dots+Y_d}{d} $$ converges in probability to $\Bbb E(Y_1)$. Since $\Bbb E\big((\bar{Y}_d^{\,s})^{1/s}\big) = \Bbb E(Y_1)$, the family $(\bar{Y}_d^{\,s})_d$ is bounded in $L^{1/s}$ with $1/s\geq 1$, hence uniformly integrable, and therefore $$ \frac{1}{d^s}\frac{\Bbb E((Y_1+\dots+Y_d)^s)}{\Bbb E(Y_1^s)} = \frac{\Bbb E(\bar{Y}_d^{\,s})}{\Bbb E(Y_1^s)} \underset{d\to\infty}{\longrightarrow} \frac{\big(\Bbb E(Y_1)\big)^s}{\Bbb E(Y_1^s)}, $$ a finite positive constant (equal to $1$ only when $Y_1$ is deterministic, by Jensen's inequality). Equivalently, in your notation, $$ \frac{\|(-\Delta)^{s/2}F_d\|_{L^2}^2}{\|(-\Delta)^{s/2}f\|_{L^2}^2} \underset{d\to\infty}{\asymp} d^s. $$ Any comment is welcome, as I do not trust myself enough in probability theory, in particular about the exact hypotheses needed for the convergence: when $f\in H^s\setminus H^1$, $Y_1$ may have infinite mean and a heavier-tailed analysis would be needed. But at least this indicates that the answer should be of order $d^s$.
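The convergence $\Bbb E(\bar Y_d^{\,s})\to(\Bbb E Y_1)^s$ above can be illustrated by a quick Monte Carlo check. A sketch under an illustrative choice of law (not tied to any particular $f$): take $Y_i\sim\mathrm{Exp}(1)$, for which $\Bbb E(Y_1)=1$ and $\Bbb E(Y_1^s)=\Gamma(1+s)$ are both finite:

```python
import math
import numpy as np

rng = np.random.default_rng(0)
s = 0.5
n = 10_000  # Monte Carlo sample size

# Estimate E[(Ybar_d)^s] for Y_i ~ Exp(1), where E[Y_1] = 1 and
# E[Y_1^s] = Gamma(1 + s) ~ 0.886 for s = 1/2.
est = {}
for d in [1, 10, 100, 1000]:
    Y = rng.exponential(size=(n, d))    # n independent samples of (Y_1, ..., Y_d)
    Ybar = Y.mean(axis=1)               # the empirical mean Ybar_d in each sample
    est[d] = float((Ybar ** s).mean())  # Monte Carlo estimate of E[(Ybar_d)^s]

# est[d] approaches (E[Y_1])^s = 1 as d grows (law of large numbers plus
# uniform integrability), while est[1] is close to E[Y_1^s] = Gamma(1 + s);
# so the limit of E[(Ybar_d)^s] / E[Y_1^s] is (E[Y_1])^s / E[Y_1^s], not 1.
```

This also shows concretely why the limiting constant differs from $1$ whenever $Y_1$ is non-deterministic.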