Let $(X_i)_{i\in\mathbb{N}}$ be an i.i.d. sequence of random variables and $S_n:=\sum_{i=1}^n X_i$. Moreover, let $M_{X_1}$ denote the moment-generating function and $\Lambda:=\log M_{X_1}$. Define $$ I(x):=\sup_{\theta\geqslant 0}\left\{\theta x-\Lambda(\theta)\right\}. $$ Finally, set $$ D:=\left\{x: I(x)<\infty\right\},\qquad E:=\left\{\theta: \Lambda(\theta)<\infty\right\}. $$ Cramér's theorem says that for each $x>\mathbb{E}(X_1)=:z$ with $x\in\text{int}(D)$, we have $$ \lim_{n\to\infty}\frac{1}{n}\ln P(S_n\geqslant nx)=-I(x). $$
Intuitively, I would think that this implies that $$ P(S_n\geqslant nx)\sim e^{-nI(x)}. $$ Indeed, I have read this in some books and papers.
Others say that this is false and write instead $$ P(S_n\geqslant nx)= e^{-nI(x)+o(n)}, $$ and still others write $$ P(S_n\geqslant nx)=\Phi(n)e^{-nI(x)}\text{ with }\log\Phi(n)\in o(n). $$
I am a bit confused. Which version is correct?
$$\lim_{n \to \infty} \dfrac{1}{n} \ln a_n = c$$ says that for any $\epsilon > 0$, we eventually have $$c - \epsilon < \dfrac{1}{n} \ln a_n < c + \epsilon$$ and thus $$ e^{n c - n \epsilon} < a_n < e^{nc + n \epsilon} $$ It is equivalent to $$ a_n = e^{nc + o(n)}$$ and if you define $\Phi(n) = a_n e^{-nc}$ you have $\Phi(n) = e^{o(n)}$, i.e. $\ln \Phi(n) = o(n)$. So your two "others" are equivalent and correct.
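To make this concrete, here is a quick sanity check (my own illustrative example, not from the question): take $a_n = e^{nc}/n$. The sequence satisfies $\frac{1}{n}\ln a_n \to c$, yet $a_n/e^{nc} = 1/n \to 0$, so the log-limit tolerates a subexponential prefactor.

```python
import math

# a_n = e^{nc} / n satisfies (1/n) ln a_n = c - ln(n)/n -> c,
# yet a_n / e^{nc} = 1/n -> 0, so a_n is NOT asymptotic to e^{nc}.
c = -0.5

def log_rate(n):
    # (1/n) * ln(a_n) for a_n = e^{nc} / n, computed in log form to avoid underflow
    return c - math.log(n) / n
```

The correction term $-\ln(n)/n$ is exactly the $o(n)/n$ piece: it vanishes in the limit, but the prefactor $1/n$ it encodes does not tend to $1$.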
But it's not necessarily true that $a_n \sim e^{nc}$, which would require the $o(n)$ term to be $o(1)$.
For example, suppose $X_n \sim \mathcal N(0,1)$, so $S_n \sim \mathcal N(0, n)$. Then
$$P(S_n \ge n x) = \dfrac{1 - \text{erf}(x \sqrt{n/2})}{2}$$ and using Watson's lemma, I get $$ P(S_n \ge n x) \sim e^{-n x^2/2} \left( \dfrac{1}{x \sqrt{2\pi n}} - \dfrac{1}{x^3 \sqrt{2\pi n^3}} + \dfrac{3}{x^5 \sqrt{2\pi n^5}}\right)$$
So in this case it's certainly not $\sim e^{-n I(x)}$, but it is $e^{-n I(x) + o(n)}$, where $I(x) = x^2/2$. Note that $$ \ln \Phi(n) = \ln\left( \dfrac{1}{x \sqrt{2\pi n}} + \ldots \right) = - \ln(x \sqrt{2\pi}) - \dfrac{1}{2} \ln(n) + \ldots = o(n)$$
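A quick numerical check of this Gaussian example, using only the standard library (`tail_prob`, `rate`, and `ratio` are my own helper names): the rate $\frac{1}{n}\ln P(S_n \geqslant nx)$ settles near $-x^2/2$, while $P(S_n \geqslant nx)\,e^{nI(x)}$ keeps shrinking, confirming the two statements are not equivalent.

```python
import math

def tail_prob(n, x):
    # P(S_n >= n x) for S_n ~ N(0, n): standardizing gives erfc(x * sqrt(n/2)) / 2
    return 0.5 * math.erfc(x * math.sqrt(n / 2))

def rate(n, x):
    # (1/n) ln P(S_n >= n x); should approach -I(x) = -x^2/2 as n grows
    return math.log(tail_prob(n, x)) / n

def ratio(n, x):
    # P(S_n >= n x) / e^{-n I(x)}; tends to 0 like 1/(x sqrt(2 pi n)),
    # so the tail probability is NOT asymptotic to e^{-n I(x)}
    return tail_prob(n, x) * math.exp(n * x * x / 2)
```

For $x=1$, the rate is within $0.01$ of $-1/2$ by $n=1000$, but the ratio at $n=1000$ is already far below $1$ and still decreasing (note `erfc` underflows for much larger $n$, so this check is limited to moderate $n$).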