Given a random variable $X$ that is $N(0,1)$ distributed and a sequence $(X_i)$ of iid $N(0,1)$ random variables (copies of $X$), I am supposed to calculate $P(X \ge 5)$ by means of large deviations.
Hence, I calculated the rate function $\gamma^*(l) = \frac{l^2}{2}$, and now I am stuck. Is the random variable $Z:=\frac{e^{-\gamma^*(5)} }{(2\pi)^{\frac{n}{2}}}e^{- \frac{X_1^2+...+X_n^2}{n}}$ now my estimator for this event? I am a little confused. Basically, I want to apply this theory here: wikipedia reference
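In case it helps to make the question concrete: my understanding (which may not be what the exercise intends) is that the standard Monte Carlo approach here is importance sampling with an exponential tilt, i.e. sampling from $N(5,1)$ and reweighting by the likelihood ratio $\varphi(y)/\varphi_5(y) = e^{-5y + 25/2}$. A minimal sketch, with the function name and sample size being my own choices:

```python
import math
import random

def tilted_is_estimate(a=5.0, n=100_000, seed=0):
    """Estimate P(X >= a) for X ~ N(0,1) by exponential tilting:
    draw Y ~ N(a, 1) and weight by phi(y)/phi_a(y) = exp(-a*y + a^2/2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(a, 1.0)        # sample from the tilted law N(a, 1)
        if y >= a:                   # indicator of the rare event {Y >= a}
            total += math.exp(-a * y + a * a / 2)  # importance weight
    return total / n

print(tilted_is_estimate())  # close to the exact value 2.87e-7
```

Under the tilted law the event $\{Y \ge 5\}$ occurs with probability $1/2$, so the relative error is tiny compared with naive Monte Carlo, where one would essentially never see the event.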
If anything is unclear, please let me know.
If $X$ is standard normal, then, when $x\to+\infty$, $$ P(X\geqslant x)\sim\frac{\mathrm e^{-x^2/2}}{x\sqrt{2\pi}}, $$ in the sense that the ratio of the LHS and the RHS converges to $1$. For $x=5$ this suggests that $P(X\geqslant5)$ might be close to $$ \frac{\mathrm e^{-12.5}}{5\sqrt{2\pi}}\approx2.97\cdot10^{-7}, $$ while the exact value is $$ P(X\geqslant5)\approx2.87\cdot10^{-7}. $$ Not an ounce of large deviations here. Large deviations in this context would yield the cruder estimate $$ P(X\geqslant x)=\exp\left(-\frac12x^2+o(x^2)\right), $$ or, equivalently, $$ \lim_{x\to+\infty}\frac{\log P(X\geqslant x)}{x^2}=-\frac12. $$
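The two numbers above are easy to check with the standard library, since $P(X\geqslant x) = \tfrac12\operatorname{erfc}(x/\sqrt2)$; a quick sketch (function names are mine):

```python
import math

def normal_tail_exact(x):
    """P(X >= x) for X ~ N(0,1), via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def normal_tail_asymptotic(x):
    """Equivalent as x -> +infinity: exp(-x^2/2) / (x * sqrt(2*pi))."""
    return math.exp(-x * x / 2) / (x * math.sqrt(2 * math.pi))

x = 5.0
print(normal_tail_exact(x))       # about 2.87e-7
print(normal_tail_asymptotic(x))  # about 2.97e-7
```

Already at $x=5$ the ratio of the two is about $1.04$, consistent with the equivalence stated above.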