Consider a sequence of functions $f_N(x)$ for which I know $\limsup_{N \to \infty} \frac{1}{N} \log f_N(x) = -g(x)$, where $g(x)$ is positive and continuous on $(a, \infty)$ and increases to infinity as $x \to \infty$. Moreover, \begin{equation} g(x) \geq - \log(x)+V(x)+C \hspace{1cm} \text{for } x \geq a, \text{ with } C \text{ a positive constant,} \end{equation} where $V(x)$ is a continuous function satisfying \begin{equation} \liminf_{x \to \infty} \frac{V(x)}{\beta \log(x)} >1 \hspace{1cm} \text{for some fixed } \beta >1. \end{equation}
I want to show that $\lim_{N \to \infty} \int_b^{\infty} f_N(x) x dx = 0$ for $b >a$.
$\bullet$ Origin of the problem: the $f_N$'s are tail probabilities, $f_N(x) = P(X_N > x)$, where the $X_N$ satisfy a large deviation principle with good rate function $g(x)$ having the above properties, and I want to show that $\mathbb{E} [ X_N \mathbb{I}\{ X_N > b \} ]$ converges to $0$ as $N \to \infty$.
My first question is: what is the interpretation of $ \liminf_{x \to \infty} \frac{V(x)}{\beta \log(x) } >1$? Can we deduce that $V(x) > \beta \log(x)$?
Assuming $V(x) > \beta \log(x)$ for all $x$, here is what I have tried so far:
First note that we have $g(x) \geq \beta' \log(x) + C$ for some positive constants $\beta', C$. From $\limsup_{N \to \infty} \frac{1}{N} \log f_N(x) = -g(x)$, given $0<\epsilon <C$, for large enough $N$ we have $f_N(x) \leq e^{-N(g(x) - \epsilon)}$. So we have: $$ \int_b^{\infty} f_N(x)\, x\, dx \leq \int_b^{\infty} e^{-N(g(x) - \epsilon)}\, x\, dx \leq e^{-N(C-\epsilon)} \int_b^{\infty} e^{-N \beta' \log(x)}\, x\, dx = e^{-N(C-\epsilon)} \frac {b^{2 - N \beta'}}{N \beta' -2} \hspace{5mm} \text{for sufficiently large } N. $$ Taking the limit, we have the result.
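Spelling out the last equality (an elementary power integral, convergent once $N\beta' > 2$): $$ \int_b^{\infty} e^{-N \beta' \log(x)}\, x\, dx = \int_b^{\infty} x^{\,1 - N\beta'}\, dx = \left[ \frac{x^{\,2 - N\beta'}}{2 - N\beta'} \right]_b^{\infty} = \frac{b^{\,2 - N\beta'}}{N\beta' - 2}. $$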
I'm not sure if it is correct or not. Any help is appreciated!
Without uniformity in $x$, it is not necessarily true. Consider $$f_N(x) = \begin{cases} \exp(-N\cdot g(x)), & x < N\\ \text{anything}, & x \geq N \end{cases}$$
Then not just the $\limsup$, but even $\lim\limits_{N\to \infty} \frac{1}{N} \log f_N(x) = -g(x)$ for every fixed $x$, yet $\int_b^\infty f_N(x)\, x\, dx$ can be made arbitrarily large for each $N$ independently.
Assuming uniformity in the first limit ($\forall \varepsilon > 0\ \exists N_0\ \forall N \geq N_0\ \forall x: \frac{1}{N}\log f_N(x) < -g(x) + \varepsilon$); that's what is needed for $f_N(x) \leq e^{-N(g(x) - \epsilon)}$. With that, you are almost correct.
$\liminf\limits_{x \to \infty} h(x) > 1$ means exactly that for some $\varepsilon > 0$ and $x_0$, $h(x) > 1 + \varepsilon$ whenever $x > x_0$. So you do not get $V(x) > \beta \log(x)$ for all $x$, but you do get $g(x) \geq \beta' \log(x) + C$ for some $\beta' > 0$ whenever $x > x_0$. Then your reasoning shows that $\int_{x_0}^\infty f_N(x)\, x\, dx \to 0$.
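Explicitly, applying this to $h(x) = \frac{V(x)}{\beta \log(x)}$: there are $\varepsilon > 0$ and $x_0$ with $V(x) > (1+\varepsilon)\beta \log(x)$ for $x > x_0$, so $$ g(x) \geq -\log(x) + V(x) + C > \big( (1+\varepsilon)\beta - 1 \big) \log(x) + C = \beta' \log(x) + C \qquad \text{for } x > x_0, $$ with $\beta' = (1+\varepsilon)\beta - 1 > \beta - 1 > 0$ since $\beta > 1$.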
We only need to deal with $\int_b^{x_0}f_N(x)\, x\, dx$. But $g(x)$ is bounded away from $0$ on $[b, x_0]$ (as a continuous positive function on a compact interval), so say $\inf_{x \in [b, x_0]} g(x) = \delta > 0$. For large enough $N$, $f_N(x) \leq \exp(-N(g(x) - \delta / 2)) \leq \exp(-N \delta / 2)$, hence $\int_{b}^{x_0} f_N(x)\, x\, dx \leq \exp(-N \delta / 2) \cdot x_0 (x_0 - b) \to 0$.
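As a numerical sanity check on a toy instance of my own (these choices of $g$, $f_N$, $b$ are not from the question): take $g(x) = 2\log(x)$, $f_N(x) = e^{-N g(x)} = x^{-2N}$, and $b = 2$, so that $\int_b^\infty f_N(x)\, x\, dx = \frac{b^{2-2N}}{2N-2}$ for $N \geq 2$. A crude midpoint-rule quadrature confirms the decay to $0$:

```python
def tail_expectation(N, b=2.0, upper=500.0, steps=200_000):
    """Midpoint-rule approximation of the tail integral
    \int_b^upper f_N(x) * x dx  with  f_N(x) = x**(-2N),
    i.e. the toy rate function g(x) = 2*log(x).
    For N >= 2 the tail beyond `upper` is negligible."""
    h = (upper - b) / steps
    return sum((b + (i + 0.5) * h) ** (1 - 2 * N) for i in range(steps)) * h

# Tail expectation for increasing N; exact value is b**(2-2N) / (2N-2),
# i.e. 0.125, 0.015625, 0.000488... for N = 2, 3, 5.
vals = [tail_expectation(N) for N in (2, 3, 5)]
```

The values decrease rapidly toward $0$ as $N$ grows, matching the closed form $\frac{b^{2-2N}}{2N-2}$.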