Let $f:\mathbb{R}^n\to\mathbb{R}$ be a continuous function on the compact set $[0,1]^n$, and suppose $0<C_1\leq f(x)\leq C_2$ on this set.
For all $\beta\in[1,\infty)$, I would like to bound (independently of $\beta$) the quantity \begin{equation} \frac{\int_{[0,1]^n}\exp(2\beta f(x))\,dx_1\ldots dx_n}{\beta\left(\int_{[0,1]^n}\exp(\beta f(x))\,dx_1\ldots dx_n\right)^2}. \tag{1} \end{equation} Equivalently (since the ratio is continuous in $\beta$ and hence bounded on compact subintervals of $[1,\infty)$), I want to show that \begin{equation} \limsup_{\beta\to\infty}\frac{\int_{[0,1]^n}\exp(2\beta f(x))\,dx_1\ldots dx_n}{\beta\left(\int_{[0,1]^n}\exp(\beta f(x))\,dx_1\ldots dx_n\right)^2}<\infty. \tag{2} \end{equation}
When $n=1$, I can show that (1) is bounded by a constant depending only on $C_1$, $C_2$, and $n$. My proof sketch expresses the numerator and the denominator each as a series, obtained by integration by parts and induction. However, the integration by parts is hard to carry out for $n>1$.
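As a sanity check (just the simplest example I tried, not part of my proof), take $n=1$ and $f(x)=x+1$, so that $C_1=1$ and $C_2=2$; then (1) can be computed in closed form:
$$ \frac{\int_0^1 e^{2\beta(x+1)}\,dx}{\beta\left(\int_0^1 e^{\beta(x+1)}\,dx\right)^2} = \frac{e^{2\beta}(e^{2\beta}-1)/(2\beta)}{\beta\cdot e^{2\beta}(e^{\beta}-1)^2/\beta^2} = \frac{e^{2\beta}-1}{2(e^{\beta}-1)^2} = \frac{e^{\beta}+1}{2(e^{\beta}-1)}, $$
which decreases from $\frac{e+1}{2(e-1)}\approx 1.08$ at $\beta=1$ to $\frac{1}{2}$ as $\beta\to\infty$, so it is uniformly bounded in this case.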
Now I want to extend this result to the case $n>1$. I used to think the extension would be easy. Here is my idea: $f$ is continuous and bounded on $[0,1]^n$, so we can rearrange its values from small to large and regard the increasing rearrangement as a continuous function of a single variable, thereby transforming the multi-dimensional problem into a one-dimensional one. However, I have failed to find a reference for such a construction. Can someone bound (1) using the fact that (1) is bounded when $n=1$? I think induction may be helpful.
I also have another idea: prove that there exists a one-dimensional function $g$ (which must be independent of $\beta$) such that $\int_{[0,1]}\exp(2\beta g(y))\,dy=\int_{[0,1]^n}\exp(2\beta f(x))\,dx_1\ldots dx_n$ and $\int_{[0,1]}\exp(\beta g(y))\,dy=\int_{[0,1]^n}\exp(\beta f(x))\,dx_1\ldots dx_n$. This would again reduce the multi-dimensional problem to a one-dimensional one, and it also seems feasible; one candidate is sketched below.
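Concretely (this is only a sketch of the candidate I have in mind, and I have not checked all regularity details), one could take $g$ to be the increasing rearrangement $f^*$ of $f$, defined through the distribution function of $f$ under Lebesgue measure:
$$ f^*(y):=\inf\bigl\{t\in\mathbb{R}:\ \bigl|\{x\in[0,1]^n: f(x)\le t\}\bigr|\ge y\bigr\},\qquad y\in(0,1). $$
Since $f$ and $f^*$ are equimeasurable, $\int_{[0,1]}\Phi(f^*(y))\,dy=\int_{[0,1]^n}\Phi(f(x))\,dx_1\ldots dx_n$ for every bounded Borel function $\Phi$, in particular for $\Phi(u)=e^{\beta u}$ and $\Phi(u)=e^{2\beta u}$ simultaneously for all $\beta$; here $|\cdot|$ denotes Lebesgue measure. The resulting $g=f^*$ is nondecreasing, and it appears to be continuous because $[0,1]^n$ is connected, though I have not verified this carefully.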
Here is an example in which the ratio diverges as $\beta \to \infty$. Let
$$ f(x) = \begin{cases} 0, & x = 0 \\ -\frac{1}{1-\log x}, & x \in (0, 1] \end{cases} $$
It is easy to check that $f$ is continuous and bounded on $[0, 1]$. (Although this $f$ does not satisfy $0<C_1\le f$, the ratio (1) is unchanged when a constant is added to $f$, since the numerator and the squared integral in the denominator pick up the same factor $e^{2\beta c}$; so replacing $f$ by $f+2$ gives an example with $C_1=1$ and $C_2=2$.) Then
\begin{align*} M(\beta) &:= \int_{0}^{1} e^{\beta f(x)} \, \mathrm{d}x \\ &= e \beta^{1/2} \int_{0}^{\sqrt{\beta}} \frac{1}{y^2} e^{-\sqrt{\beta}(y+y^{-1})} \, \mathrm{d}y \tag{$y=\tfrac{\sqrt{\beta}}{1-\log x}$} \\ &= e \beta^{1/2} e^{-2\sqrt{\beta}} \int_{-1}^{\sqrt{\beta}-1} \frac{1}{(t+1)^2} e^{-\sqrt{\beta}t^2/(t+1)} \, \mathrm{d}t \tag{$t=y-1$} \\ &\sim e\sqrt{\pi} \beta^{1/4} e^{-2\sqrt{\beta}} \end{align*}
as $\beta \to \infty$. (The last step is easily verified by further substituting $s = \beta^{1/4}t$ or invoking Laplace's method; a brief sketch is included at the end of this answer.) Consequently,
$$ \frac{M(2\beta)}{M(\beta)^2} \sim C \beta^{-1/4} e^{(4-2\sqrt{2})\sqrt{\beta}} $$
as $\beta \to \infty$ for some positive constant $C$, whose value is irrelevant to our discussion. In particular,
$$ \lim_{\beta \to \infty} \frac{M(2\beta)}{\beta^k M(\beta)^2} = \infty \qquad \text{for any } k \in \mathbb{R}. $$
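For completeness, here is a sketch of the two steps referred to above, using only the substitution $s=\beta^{1/4}t$ already mentioned. First,
$$ \int_{-1}^{\sqrt{\beta}-1}\frac{e^{-\sqrt{\beta}t^2/(t+1)}}{(t+1)^2}\,\mathrm{d}t = \beta^{-1/4}\int_{-\beta^{1/4}}^{\beta^{3/4}-\beta^{1/4}} \frac{e^{-s^2/(1+\beta^{-1/4}s)}}{(1+\beta^{-1/4}s)^2}\,\mathrm{d}s \sim \beta^{-1/4}\int_{\mathbb{R}}e^{-s^2}\,\mathrm{d}s = \sqrt{\pi}\,\beta^{-1/4}, $$
since the integrand converges pointwise to $e^{-s^2}$ and the contributions from $t$ near $-1$ and from large $t$ are negligible (dominated convergence, or Laplace's method). Second, inserting the resulting asymptotics for $M(2\beta)$ and $M(\beta)$,
$$ \frac{M(2\beta)}{M(\beta)^2} \sim \frac{e\sqrt{\pi}\,(2\beta)^{1/4}e^{-2\sqrt{2\beta}}}{\bigl(e\sqrt{\pi}\,\beta^{1/4}e^{-2\sqrt{\beta}}\bigr)^{2}} = \frac{2^{1/4}}{e\sqrt{\pi}}\,\beta^{-1/4}e^{(4-2\sqrt{2})\sqrt{\beta}}, $$
so one may take $C=2^{1/4}/(e\sqrt{\pi})$, and since $4-2\sqrt{2}>0$ the last limit is indeed infinite for every $k$.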